Sep 30 06:02:24 localhost kernel: Linux version 5.14.0-617.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Mon Sep 15 21:46:13 UTC 2025
Sep 30 06:02:24 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Sep 30 06:02:24 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Sep 30 06:02:24 localhost kernel: BIOS-provided physical RAM map:
Sep 30 06:02:24 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Sep 30 06:02:24 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Sep 30 06:02:24 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Sep 30 06:02:24 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Sep 30 06:02:24 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Sep 30 06:02:24 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Sep 30 06:02:24 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Sep 30 06:02:24 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Sep 30 06:02:24 localhost kernel: NX (Execute Disable) protection: active
Sep 30 06:02:24 localhost kernel: APIC: Static calls initialized
Sep 30 06:02:24 localhost kernel: SMBIOS 2.8 present.
Sep 30 06:02:24 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Sep 30 06:02:24 localhost kernel: Hypervisor detected: KVM
Sep 30 06:02:24 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Sep 30 06:02:24 localhost kernel: kvm-clock: using sched offset of 3960028420 cycles
Sep 30 06:02:24 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Sep 30 06:02:24 localhost kernel: tsc: Detected 2800.000 MHz processor
Sep 30 06:02:24 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Sep 30 06:02:24 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Sep 30 06:02:24 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Sep 30 06:02:24 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Sep 30 06:02:24 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Sep 30 06:02:24 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Sep 30 06:02:24 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Sep 30 06:02:24 localhost kernel: Using GB pages for direct mapping
Sep 30 06:02:24 localhost kernel: RAMDISK: [mem 0x2d7d0000-0x32bdffff]
Sep 30 06:02:24 localhost kernel: ACPI: Early table checksum verification disabled
Sep 30 06:02:24 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Sep 30 06:02:24 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 06:02:24 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 06:02:24 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 06:02:24 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Sep 30 06:02:24 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 06:02:24 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Sep 30 06:02:24 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Sep 30 06:02:24 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Sep 30 06:02:24 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Sep 30 06:02:24 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Sep 30 06:02:24 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Sep 30 06:02:24 localhost kernel: No NUMA configuration found
Sep 30 06:02:24 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Sep 30 06:02:24 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Sep 30 06:02:24 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Sep 30 06:02:24 localhost kernel: Zone ranges:
Sep 30 06:02:24 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Sep 30 06:02:24 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Sep 30 06:02:24 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Sep 30 06:02:24 localhost kernel:   Device   empty
Sep 30 06:02:24 localhost kernel: Movable zone start for each node
Sep 30 06:02:24 localhost kernel: Early memory node ranges
Sep 30 06:02:24 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Sep 30 06:02:24 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Sep 30 06:02:24 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Sep 30 06:02:24 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Sep 30 06:02:24 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Sep 30 06:02:24 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Sep 30 06:02:24 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Sep 30 06:02:24 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Sep 30 06:02:24 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Sep 30 06:02:24 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Sep 30 06:02:24 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Sep 30 06:02:24 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Sep 30 06:02:24 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Sep 30 06:02:24 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Sep 30 06:02:24 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Sep 30 06:02:24 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Sep 30 06:02:24 localhost kernel: TSC deadline timer available
Sep 30 06:02:24 localhost kernel: CPU topo: Max. logical packages:   8
Sep 30 06:02:24 localhost kernel: CPU topo: Max. logical dies:       8
Sep 30 06:02:24 localhost kernel: CPU topo: Max. dies per package:   1
Sep 30 06:02:24 localhost kernel: CPU topo: Max. threads per core:   1
Sep 30 06:02:24 localhost kernel: CPU topo: Num. cores per package:     1
Sep 30 06:02:24 localhost kernel: CPU topo: Num. threads per package:   1
Sep 30 06:02:24 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Sep 30 06:02:24 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Sep 30 06:02:24 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Sep 30 06:02:24 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Sep 30 06:02:24 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Sep 30 06:02:24 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Sep 30 06:02:24 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Sep 30 06:02:24 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Sep 30 06:02:24 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Sep 30 06:02:24 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Sep 30 06:02:24 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Sep 30 06:02:24 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Sep 30 06:02:24 localhost kernel: Booting paravirtualized kernel on KVM
Sep 30 06:02:24 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Sep 30 06:02:24 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Sep 30 06:02:24 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Sep 30 06:02:24 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Sep 30 06:02:24 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Sep 30 06:02:24 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Sep 30 06:02:24 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Sep 30 06:02:24 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64", will be passed to user space.
Sep 30 06:02:24 localhost kernel: random: crng init done
Sep 30 06:02:24 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Sep 30 06:02:24 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 30 06:02:24 localhost kernel: Fallback order for Node 0: 0 
Sep 30 06:02:24 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Sep 30 06:02:24 localhost kernel: Policy zone: Normal
Sep 30 06:02:24 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 30 06:02:24 localhost kernel: software IO TLB: area num 8.
Sep 30 06:02:24 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Sep 30 06:02:24 localhost kernel: ftrace: allocating 49329 entries in 193 pages
Sep 30 06:02:24 localhost kernel: ftrace: allocated 193 pages with 3 groups
Sep 30 06:02:24 localhost kernel: Dynamic Preempt: voluntary
Sep 30 06:02:24 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 30 06:02:24 localhost kernel: rcu:         RCU event tracing is enabled.
Sep 30 06:02:24 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Sep 30 06:02:24 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Sep 30 06:02:24 localhost kernel:         Rude variant of Tasks RCU enabled.
Sep 30 06:02:24 localhost kernel:         Tracing variant of Tasks RCU enabled.
Sep 30 06:02:24 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 30 06:02:24 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Sep 30 06:02:24 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Sep 30 06:02:24 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Sep 30 06:02:24 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Sep 30 06:02:24 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Sep 30 06:02:24 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 30 06:02:24 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Sep 30 06:02:24 localhost kernel: Console: colour VGA+ 80x25
Sep 30 06:02:24 localhost kernel: printk: console [ttyS0] enabled
Sep 30 06:02:24 localhost kernel: ACPI: Core revision 20230331
Sep 30 06:02:24 localhost kernel: APIC: Switch to symmetric I/O mode setup
Sep 30 06:02:24 localhost kernel: x2apic enabled
Sep 30 06:02:24 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Sep 30 06:02:24 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Sep 30 06:02:24 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Sep 30 06:02:24 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Sep 30 06:02:24 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Sep 30 06:02:24 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Sep 30 06:02:24 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Sep 30 06:02:24 localhost kernel: Spectre V2 : Mitigation: Retpolines
Sep 30 06:02:24 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Sep 30 06:02:24 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Sep 30 06:02:24 localhost kernel: RETBleed: Mitigation: untrained return thunk
Sep 30 06:02:24 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Sep 30 06:02:24 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Sep 30 06:02:24 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Sep 30 06:02:24 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Sep 30 06:02:24 localhost kernel: x86/bugs: return thunk changed
Sep 30 06:02:24 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Sep 30 06:02:24 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Sep 30 06:02:24 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Sep 30 06:02:24 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Sep 30 06:02:24 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Sep 30 06:02:24 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Sep 30 06:02:24 localhost kernel: Freeing SMP alternatives memory: 40K
Sep 30 06:02:24 localhost kernel: pid_max: default: 32768 minimum: 301
Sep 30 06:02:24 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Sep 30 06:02:24 localhost kernel: landlock: Up and running.
Sep 30 06:02:24 localhost kernel: Yama: becoming mindful.
Sep 30 06:02:24 localhost kernel: SELinux:  Initializing.
Sep 30 06:02:24 localhost kernel: LSM support for eBPF active
Sep 30 06:02:24 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 30 06:02:24 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Sep 30 06:02:24 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Sep 30 06:02:24 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Sep 30 06:02:24 localhost kernel: ... version:                0
Sep 30 06:02:24 localhost kernel: ... bit width:              48
Sep 30 06:02:24 localhost kernel: ... generic registers:      6
Sep 30 06:02:24 localhost kernel: ... value mask:             0000ffffffffffff
Sep 30 06:02:24 localhost kernel: ... max period:             00007fffffffffff
Sep 30 06:02:24 localhost kernel: ... fixed-purpose events:   0
Sep 30 06:02:24 localhost kernel: ... event mask:             000000000000003f
Sep 30 06:02:24 localhost kernel: signal: max sigframe size: 1776
Sep 30 06:02:24 localhost kernel: rcu: Hierarchical SRCU implementation.
Sep 30 06:02:24 localhost kernel: rcu:         Max phase no-delay instances is 400.
Sep 30 06:02:24 localhost kernel: smp: Bringing up secondary CPUs ...
Sep 30 06:02:24 localhost kernel: smpboot: x86: Booting SMP configuration:
Sep 30 06:02:24 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Sep 30 06:02:24 localhost kernel: smp: Brought up 1 node, 8 CPUs
Sep 30 06:02:24 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Sep 30 06:02:24 localhost kernel: node 0 deferred pages initialised in 27ms
Sep 30 06:02:24 localhost kernel: Memory: 7765684K/8388068K available (16384K kernel code, 5784K rwdata, 13988K rodata, 4072K init, 7304K bss, 616480K reserved, 0K cma-reserved)
Sep 30 06:02:24 localhost kernel: devtmpfs: initialized
Sep 30 06:02:24 localhost kernel: x86/mm: Memory block size: 128MB
Sep 30 06:02:24 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 30 06:02:24 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Sep 30 06:02:24 localhost kernel: pinctrl core: initialized pinctrl subsystem
Sep 30 06:02:24 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 30 06:02:24 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Sep 30 06:02:24 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 30 06:02:24 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 30 06:02:24 localhost kernel: audit: initializing netlink subsys (disabled)
Sep 30 06:02:24 localhost kernel: audit: type=2000 audit(1759212142.860:1): state=initialized audit_enabled=0 res=1
Sep 30 06:02:24 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Sep 30 06:02:24 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 30 06:02:24 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Sep 30 06:02:24 localhost kernel: cpuidle: using governor menu
Sep 30 06:02:24 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 30 06:02:24 localhost kernel: PCI: Using configuration type 1 for base access
Sep 30 06:02:24 localhost kernel: PCI: Using configuration type 1 for extended access
Sep 30 06:02:24 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Sep 30 06:02:24 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 30 06:02:24 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Sep 30 06:02:24 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 30 06:02:24 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Sep 30 06:02:24 localhost kernel: Demotion targets for Node 0: null
Sep 30 06:02:24 localhost kernel: cryptd: max_cpu_qlen set to 1000
Sep 30 06:02:24 localhost kernel: ACPI: Added _OSI(Module Device)
Sep 30 06:02:24 localhost kernel: ACPI: Added _OSI(Processor Device)
Sep 30 06:02:24 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Sep 30 06:02:24 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 30 06:02:24 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 30 06:02:24 localhost kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Sep 30 06:02:24 localhost kernel: ACPI: Interpreter enabled
Sep 30 06:02:24 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Sep 30 06:02:24 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Sep 30 06:02:24 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Sep 30 06:02:24 localhost kernel: PCI: Using E820 reservations for host bridge windows
Sep 30 06:02:24 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Sep 30 06:02:24 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 30 06:02:24 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [3] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [4] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [5] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [6] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [7] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [8] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [9] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [10] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [11] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [12] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [13] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [14] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [15] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [16] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [17] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [18] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [19] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [20] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [21] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [22] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [23] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [24] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [25] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [26] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [27] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [28] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [29] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [30] registered
Sep 30 06:02:24 localhost kernel: acpiphp: Slot [31] registered
Sep 30 06:02:24 localhost kernel: PCI host bridge to bus 0000:00
Sep 30 06:02:24 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Sep 30 06:02:24 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Sep 30 06:02:24 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Sep 30 06:02:24 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Sep 30 06:02:24 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Sep 30 06:02:24 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 30 06:02:24 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Sep 30 06:02:24 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Sep 30 06:02:24 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Sep 30 06:02:24 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Sep 30 06:02:24 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Sep 30 06:02:24 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Sep 30 06:02:24 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Sep 30 06:02:24 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Sep 30 06:02:24 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Sep 30 06:02:24 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Sep 30 06:02:24 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Sep 30 06:02:24 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Sep 30 06:02:24 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Sep 30 06:02:24 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Sep 30 06:02:24 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Sep 30 06:02:24 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Sep 30 06:02:24 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Sep 30 06:02:24 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Sep 30 06:02:24 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Sep 30 06:02:24 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 30 06:02:24 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Sep 30 06:02:24 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Sep 30 06:02:24 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Sep 30 06:02:24 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Sep 30 06:02:24 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Sep 30 06:02:24 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Sep 30 06:02:24 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Sep 30 06:02:24 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Sep 30 06:02:24 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Sep 30 06:02:24 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Sep 30 06:02:24 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Sep 30 06:02:24 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Sep 30 06:02:24 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Sep 30 06:02:24 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Sep 30 06:02:24 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Sep 30 06:02:24 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Sep 30 06:02:24 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Sep 30 06:02:24 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Sep 30 06:02:24 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Sep 30 06:02:24 localhost kernel: iommu: Default domain type: Translated
Sep 30 06:02:24 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Sep 30 06:02:24 localhost kernel: SCSI subsystem initialized
Sep 30 06:02:24 localhost kernel: ACPI: bus type USB registered
Sep 30 06:02:24 localhost kernel: usbcore: registered new interface driver usbfs
Sep 30 06:02:24 localhost kernel: usbcore: registered new interface driver hub
Sep 30 06:02:24 localhost kernel: usbcore: registered new device driver usb
Sep 30 06:02:24 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Sep 30 06:02:24 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Sep 30 06:02:24 localhost kernel: PTP clock support registered
Sep 30 06:02:24 localhost kernel: EDAC MC: Ver: 3.0.0
Sep 30 06:02:24 localhost kernel: NetLabel: Initializing
Sep 30 06:02:24 localhost kernel: NetLabel:  domain hash size = 128
Sep 30 06:02:24 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Sep 30 06:02:24 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Sep 30 06:02:24 localhost kernel: PCI: Using ACPI for IRQ routing
Sep 30 06:02:24 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Sep 30 06:02:24 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Sep 30 06:02:24 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Sep 30 06:02:24 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Sep 30 06:02:24 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Sep 30 06:02:24 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Sep 30 06:02:24 localhost kernel: vgaarb: loaded
Sep 30 06:02:24 localhost kernel: clocksource: Switched to clocksource kvm-clock
Sep 30 06:02:24 localhost kernel: VFS: Disk quotas dquot_6.6.0
Sep 30 06:02:24 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 30 06:02:24 localhost kernel: pnp: PnP ACPI init
Sep 30 06:02:24 localhost kernel: pnp 00:03: [dma 2]
Sep 30 06:02:24 localhost kernel: pnp: PnP ACPI: found 5 devices
Sep 30 06:02:24 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Sep 30 06:02:24 localhost kernel: NET: Registered PF_INET protocol family
Sep 30 06:02:24 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Sep 30 06:02:24 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Sep 30 06:02:24 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 30 06:02:24 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 30 06:02:24 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Sep 30 06:02:24 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Sep 30 06:02:24 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Sep 30 06:02:24 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Sep 30 06:02:24 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Sep 30 06:02:24 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 30 06:02:24 localhost kernel: NET: Registered PF_XDP protocol family
Sep 30 06:02:24 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Sep 30 06:02:24 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Sep 30 06:02:24 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Sep 30 06:02:24 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Sep 30 06:02:24 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Sep 30 06:02:24 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Sep 30 06:02:24 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Sep 30 06:02:24 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Sep 30 06:02:24 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 86238 usecs
Sep 30 06:02:24 localhost kernel: PCI: CLS 0 bytes, default 64
Sep 30 06:02:24 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Sep 30 06:02:24 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Sep 30 06:02:24 localhost kernel: ACPI: bus type thunderbolt registered
Sep 30 06:02:24 localhost kernel: Trying to unpack rootfs image as initramfs...
Sep 30 06:02:24 localhost kernel: Initialise system trusted keyrings
Sep 30 06:02:24 localhost kernel: Key type blacklist registered
Sep 30 06:02:24 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Sep 30 06:02:24 localhost kernel: zbud: loaded
Sep 30 06:02:24 localhost kernel: integrity: Platform Keyring initialized
Sep 30 06:02:24 localhost kernel: integrity: Machine keyring initialized
Sep 30 06:02:24 localhost kernel: Freeing initrd memory: 86080K
Sep 30 06:02:24 localhost kernel: NET: Registered PF_ALG protocol family
Sep 30 06:02:24 localhost kernel: xor: automatically using best checksumming function   avx       
Sep 30 06:02:24 localhost kernel: Key type asymmetric registered
Sep 30 06:02:24 localhost kernel: Asymmetric key parser 'x509' registered
Sep 30 06:02:24 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Sep 30 06:02:24 localhost kernel: io scheduler mq-deadline registered
Sep 30 06:02:24 localhost kernel: io scheduler kyber registered
Sep 30 06:02:24 localhost kernel: io scheduler bfq registered
Sep 30 06:02:24 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Sep 30 06:02:24 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Sep 30 06:02:24 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Sep 30 06:02:24 localhost kernel: ACPI: button: Power Button [PWRF]
Sep 30 06:02:24 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Sep 30 06:02:24 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Sep 30 06:02:24 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Sep 30 06:02:24 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 30 06:02:24 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Sep 30 06:02:24 localhost kernel: Non-volatile memory driver v1.3
Sep 30 06:02:24 localhost kernel: rdac: device handler registered
Sep 30 06:02:24 localhost kernel: hp_sw: device handler registered
Sep 30 06:02:24 localhost kernel: emc: device handler registered
Sep 30 06:02:24 localhost kernel: alua: device handler registered
Sep 30 06:02:24 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Sep 30 06:02:24 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Sep 30 06:02:24 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Sep 30 06:02:24 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Sep 30 06:02:24 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Sep 30 06:02:24 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Sep 30 06:02:24 localhost kernel: usb usb1: Product: UHCI Host Controller
Sep 30 06:02:24 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-617.el9.x86_64 uhci_hcd
Sep 30 06:02:24 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Sep 30 06:02:24 localhost kernel: hub 1-0:1.0: USB hub found
Sep 30 06:02:24 localhost kernel: hub 1-0:1.0: 2 ports detected
Sep 30 06:02:24 localhost kernel: usbcore: registered new interface driver usbserial_generic
Sep 30 06:02:24 localhost kernel: usbserial: USB Serial support registered for generic
Sep 30 06:02:24 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Sep 30 06:02:24 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Sep 30 06:02:24 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Sep 30 06:02:24 localhost kernel: mousedev: PS/2 mouse device common for all mice
Sep 30 06:02:24 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Sep 30 06:02:24 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Sep 30 06:02:24 localhost kernel: rtc_cmos 00:04: registered as rtc0
Sep 30 06:02:24 localhost kernel: rtc_cmos 00:04: setting system clock to 2025-09-30T06:02:23 UTC (1759212143)
Sep 30 06:02:24 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Sep 30 06:02:24 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Sep 30 06:02:24 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Sep 30 06:02:24 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 30 06:02:24 localhost kernel: usbcore: registered new interface driver usbhid
Sep 30 06:02:24 localhost kernel: usbhid: USB HID core driver
Sep 30 06:02:24 localhost kernel: drop_monitor: Initializing network drop monitor service
Sep 30 06:02:24 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Sep 30 06:02:24 localhost kernel: Initializing XFRM netlink socket
Sep 30 06:02:24 localhost kernel: NET: Registered PF_INET6 protocol family
Sep 30 06:02:24 localhost kernel: Segment Routing with IPv6
Sep 30 06:02:24 localhost kernel: NET: Registered PF_PACKET protocol family
Sep 30 06:02:24 localhost kernel: mpls_gso: MPLS GSO support
Sep 30 06:02:24 localhost kernel: IPI shorthand broadcast: enabled
Sep 30 06:02:24 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Sep 30 06:02:24 localhost kernel: AES CTR mode by8 optimization enabled
Sep 30 06:02:24 localhost kernel: sched_clock: Marking stable (1318020990, 142559700)->(1595762530, -135181840)
Sep 30 06:02:24 localhost kernel: registered taskstats version 1
Sep 30 06:02:24 localhost kernel: Loading compiled-in X.509 certificates
Sep 30 06:02:24 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bb2966091bafcba340f8183756023c985dcc8fe9'
Sep 30 06:02:24 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Sep 30 06:02:24 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Sep 30 06:02:24 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Sep 30 06:02:24 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Sep 30 06:02:24 localhost kernel: Demotion targets for Node 0: null
Sep 30 06:02:24 localhost kernel: page_owner is disabled
Sep 30 06:02:24 localhost kernel: Key type .fscrypt registered
Sep 30 06:02:24 localhost kernel: Key type fscrypt-provisioning registered
Sep 30 06:02:24 localhost kernel: Key type big_key registered
Sep 30 06:02:24 localhost kernel: Key type encrypted registered
Sep 30 06:02:24 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 30 06:02:24 localhost kernel: Loading compiled-in module X.509 certificates
Sep 30 06:02:24 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bb2966091bafcba340f8183756023c985dcc8fe9'
Sep 30 06:02:24 localhost kernel: ima: Allocated hash algorithm: sha256
Sep 30 06:02:24 localhost kernel: ima: No architecture policies found
Sep 30 06:02:24 localhost kernel: evm: Initialising EVM extended attributes:
Sep 30 06:02:24 localhost kernel: evm: security.selinux
Sep 30 06:02:24 localhost kernel: evm: security.SMACK64 (disabled)
Sep 30 06:02:24 localhost kernel: evm: security.SMACK64EXEC (disabled)
Sep 30 06:02:24 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Sep 30 06:02:24 localhost kernel: evm: security.SMACK64MMAP (disabled)
Sep 30 06:02:24 localhost kernel: evm: security.apparmor (disabled)
Sep 30 06:02:24 localhost kernel: evm: security.ima
Sep 30 06:02:24 localhost kernel: evm: security.capability
Sep 30 06:02:24 localhost kernel: evm: HMAC attrs: 0x1
Sep 30 06:02:24 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Sep 30 06:02:24 localhost kernel: Running certificate verification RSA selftest
Sep 30 06:02:24 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Sep 30 06:02:24 localhost kernel: Running certificate verification ECDSA selftest
Sep 30 06:02:24 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Sep 30 06:02:24 localhost kernel: clk: Disabling unused clocks
Sep 30 06:02:24 localhost kernel: Freeing unused decrypted memory: 2028K
Sep 30 06:02:24 localhost kernel: Freeing unused kernel image (initmem) memory: 4072K
Sep 30 06:02:24 localhost kernel: Write protecting the kernel read-only data: 30720k
Sep 30 06:02:24 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 348K
Sep 30 06:02:24 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Sep 30 06:02:24 localhost kernel: Run /init as init process
Sep 30 06:02:24 localhost kernel:   with arguments:
Sep 30 06:02:24 localhost kernel:     /init
Sep 30 06:02:24 localhost kernel:   with environment:
Sep 30 06:02:24 localhost kernel:     HOME=/
Sep 30 06:02:24 localhost kernel:     TERM=linux
Sep 30 06:02:24 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64
Sep 30 06:02:24 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Sep 30 06:02:24 localhost systemd[1]: Detected virtualization kvm.
Sep 30 06:02:24 localhost systemd[1]: Detected architecture x86-64.
Sep 30 06:02:24 localhost systemd[1]: Running in initrd.
Sep 30 06:02:24 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Sep 30 06:02:24 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Sep 30 06:02:24 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Sep 30 06:02:24 localhost kernel: usb 1-1: Manufacturer: QEMU
Sep 30 06:02:24 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Sep 30 06:02:24 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Sep 30 06:02:24 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Sep 30 06:02:24 localhost systemd[1]: No hostname configured, using default hostname.
Sep 30 06:02:24 localhost systemd[1]: Hostname set to <localhost>.
Sep 30 06:02:24 localhost systemd[1]: Initializing machine ID from VM UUID.
Sep 30 06:02:24 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Sep 30 06:02:24 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Sep 30 06:02:24 localhost systemd[1]: Reached target Local Encrypted Volumes.
Sep 30 06:02:24 localhost systemd[1]: Reached target Initrd /usr File System.
Sep 30 06:02:24 localhost systemd[1]: Reached target Local File Systems.
Sep 30 06:02:24 localhost systemd[1]: Reached target Path Units.
Sep 30 06:02:24 localhost systemd[1]: Reached target Slice Units.
Sep 30 06:02:24 localhost systemd[1]: Reached target Swaps.
Sep 30 06:02:24 localhost systemd[1]: Reached target Timer Units.
Sep 30 06:02:24 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Sep 30 06:02:24 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Sep 30 06:02:24 localhost systemd[1]: Listening on Journal Socket.
Sep 30 06:02:24 localhost systemd[1]: Listening on udev Control Socket.
Sep 30 06:02:24 localhost systemd[1]: Listening on udev Kernel Socket.
Sep 30 06:02:24 localhost systemd[1]: Reached target Socket Units.
Sep 30 06:02:24 localhost systemd[1]: Starting Create List of Static Device Nodes...
Sep 30 06:02:24 localhost systemd[1]: Starting Journal Service...
Sep 30 06:02:24 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Sep 30 06:02:24 localhost systemd[1]: Starting Apply Kernel Variables...
Sep 30 06:02:24 localhost systemd[1]: Starting Create System Users...
Sep 30 06:02:24 localhost systemd[1]: Starting Setup Virtual Console...
Sep 30 06:02:24 localhost systemd[1]: Finished Create List of Static Device Nodes.
Sep 30 06:02:24 localhost systemd[1]: Finished Apply Kernel Variables.
Sep 30 06:02:24 localhost systemd[1]: Finished Create System Users.
Sep 30 06:02:24 localhost systemd-journald[309]: Journal started
Sep 30 06:02:24 localhost systemd-journald[309]: Runtime Journal (/run/log/journal/27011b05154b49a8b7af428b3312a4f6) is 8.0M, max 153.5M, 145.5M free.
Sep 30 06:02:24 localhost systemd-sysusers[314]: Creating group 'users' with GID 100.
Sep 30 06:02:24 localhost systemd-sysusers[314]: Creating group 'dbus' with GID 81.
Sep 30 06:02:24 localhost systemd-sysusers[314]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Sep 30 06:02:24 localhost systemd[1]: Started Journal Service.
Sep 30 06:02:24 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Sep 30 06:02:24 localhost systemd[1]: Starting Create Volatile Files and Directories...
Sep 30 06:02:24 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Sep 30 06:02:24 localhost systemd[1]: Finished Create Volatile Files and Directories.
Sep 30 06:02:25 localhost systemd[1]: Finished Setup Virtual Console.
Sep 30 06:02:25 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Sep 30 06:02:25 localhost systemd[1]: Starting dracut cmdline hook...
Sep 30 06:02:25 localhost dracut-cmdline[330]: dracut-9 dracut-057-102.git20250818.el9
Sep 30 06:02:25 localhost dracut-cmdline[330]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-617.el9.x86_64 root=UUID=d6a81468-b74c-4055-b485-def635ab40f8 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Sep 30 06:02:25 localhost systemd[1]: Finished dracut cmdline hook.
Sep 30 06:02:25 localhost systemd[1]: Starting dracut pre-udev hook...
Sep 30 06:02:25 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 30 06:02:25 localhost kernel: device-mapper: uevent: version 1.0.3
Sep 30 06:02:25 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Sep 30 06:02:25 localhost kernel: RPC: Registered named UNIX socket transport module.
Sep 30 06:02:25 localhost kernel: RPC: Registered udp transport module.
Sep 30 06:02:25 localhost kernel: RPC: Registered tcp transport module.
Sep 30 06:02:25 localhost kernel: RPC: Registered tcp-with-tls transport module.
Sep 30 06:02:25 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Sep 30 06:02:25 localhost rpc.statd[447]: Version 2.5.4 starting
Sep 30 06:02:25 localhost rpc.statd[447]: Initializing NSM state
Sep 30 06:02:25 localhost rpc.idmapd[452]: Setting log level to 0
Sep 30 06:02:25 localhost systemd[1]: Finished dracut pre-udev hook.
Sep 30 06:02:25 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Sep 30 06:02:25 localhost systemd-udevd[465]: Using default interface naming scheme 'rhel-9.0'.
Sep 30 06:02:25 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Sep 30 06:02:25 localhost systemd[1]: Starting dracut pre-trigger hook...
Sep 30 06:02:25 localhost systemd[1]: Finished dracut pre-trigger hook.
Sep 30 06:02:25 localhost systemd[1]: Starting Coldplug All udev Devices...
Sep 30 06:02:25 localhost systemd[1]: Created slice Slice /system/modprobe.
Sep 30 06:02:25 localhost systemd[1]: Starting Load Kernel Module configfs...
Sep 30 06:02:25 localhost systemd[1]: Finished Coldplug All udev Devices.
Sep 30 06:02:25 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 30 06:02:25 localhost systemd[1]: Finished Load Kernel Module configfs.
Sep 30 06:02:25 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Sep 30 06:02:25 localhost systemd[1]: Reached target Network.
Sep 30 06:02:25 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Sep 30 06:02:25 localhost systemd[1]: Starting dracut initqueue hook...
Sep 30 06:02:25 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Sep 30 06:02:25 localhost systemd-udevd[468]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 06:02:25 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Sep 30 06:02:25 localhost kernel:  vda: vda1
Sep 30 06:02:25 localhost kernel: libata version 3.00 loaded.
Sep 30 06:02:25 localhost systemd[1]: Mounting Kernel Configuration File System...
Sep 30 06:02:25 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Sep 30 06:02:25 localhost systemd[1]: Mounted Kernel Configuration File System.
Sep 30 06:02:25 localhost kernel: scsi host0: ata_piix
Sep 30 06:02:25 localhost kernel: scsi host1: ata_piix
Sep 30 06:02:25 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Sep 30 06:02:25 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Sep 30 06:02:25 localhost systemd[1]: Found device /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8.
Sep 30 06:02:25 localhost systemd[1]: Reached target Initrd Root Device.
Sep 30 06:02:25 localhost systemd[1]: Reached target System Initialization.
Sep 30 06:02:25 localhost systemd[1]: Reached target Basic System.
Sep 30 06:02:26 localhost kernel: ata1: found unknown device (class 0)
Sep 30 06:02:26 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Sep 30 06:02:26 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Sep 30 06:02:26 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Sep 30 06:02:26 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Sep 30 06:02:26 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 30 06:02:26 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Sep 30 06:02:26 localhost systemd[1]: Finished dracut initqueue hook.
Sep 30 06:02:26 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Sep 30 06:02:26 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Sep 30 06:02:26 localhost systemd[1]: Reached target Remote File Systems.
Sep 30 06:02:26 localhost systemd[1]: Starting dracut pre-mount hook...
Sep 30 06:02:26 localhost systemd[1]: Finished dracut pre-mount hook.
Sep 30 06:02:26 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8...
Sep 30 06:02:26 localhost systemd-fsck[557]: /usr/sbin/fsck.xfs: XFS file system.
Sep 30 06:02:26 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/d6a81468-b74c-4055-b485-def635ab40f8.
Sep 30 06:02:26 localhost systemd[1]: Mounting /sysroot...
Sep 30 06:02:26 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Sep 30 06:02:26 localhost kernel: XFS (vda1): Mounting V5 Filesystem d6a81468-b74c-4055-b485-def635ab40f8
Sep 30 06:02:26 localhost kernel: XFS (vda1): Ending clean mount
Sep 30 06:02:26 localhost systemd[1]: Mounted /sysroot.
Sep 30 06:02:26 localhost systemd[1]: Reached target Initrd Root File System.
Sep 30 06:02:26 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Sep 30 06:02:26 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 30 06:02:26 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Sep 30 06:02:26 localhost systemd[1]: Reached target Initrd File Systems.
Sep 30 06:02:26 localhost systemd[1]: Reached target Initrd Default Target.
Sep 30 06:02:26 localhost systemd[1]: Starting dracut mount hook...
Sep 30 06:02:26 localhost systemd[1]: Finished dracut mount hook.
Sep 30 06:02:27 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Sep 30 06:02:27 localhost rpc.idmapd[452]: exiting on signal 15
Sep 30 06:02:27 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Sep 30 06:02:27 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Sep 30 06:02:27 localhost systemd[1]: Stopped target Network.
Sep 30 06:02:27 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Sep 30 06:02:27 localhost systemd[1]: Stopped target Timer Units.
Sep 30 06:02:27 localhost systemd[1]: dbus.socket: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Sep 30 06:02:27 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Sep 30 06:02:27 localhost systemd[1]: Stopped target Initrd Default Target.
Sep 30 06:02:27 localhost systemd[1]: Stopped target Basic System.
Sep 30 06:02:27 localhost systemd[1]: Stopped target Initrd Root Device.
Sep 30 06:02:27 localhost systemd[1]: Stopped target Initrd /usr File System.
Sep 30 06:02:27 localhost systemd[1]: Stopped target Path Units.
Sep 30 06:02:27 localhost systemd[1]: Stopped target Remote File Systems.
Sep 30 06:02:27 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Sep 30 06:02:27 localhost systemd[1]: Stopped target Slice Units.
Sep 30 06:02:27 localhost systemd[1]: Stopped target Socket Units.
Sep 30 06:02:27 localhost systemd[1]: Stopped target System Initialization.
Sep 30 06:02:27 localhost systemd[1]: Stopped target Local File Systems.
Sep 30 06:02:27 localhost systemd[1]: Stopped target Swaps.
Sep 30 06:02:27 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Stopped dracut mount hook.
Sep 30 06:02:27 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Stopped dracut pre-mount hook.
Sep 30 06:02:27 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Sep 30 06:02:27 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Sep 30 06:02:27 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Stopped dracut initqueue hook.
Sep 30 06:02:27 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Stopped Apply Kernel Variables.
Sep 30 06:02:27 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Sep 30 06:02:27 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Stopped Coldplug All udev Devices.
Sep 30 06:02:27 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Stopped dracut pre-trigger hook.
Sep 30 06:02:27 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Sep 30 06:02:27 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Stopped Setup Virtual Console.
Sep 30 06:02:27 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Sep 30 06:02:27 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Sep 30 06:02:27 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Closed udev Control Socket.
Sep 30 06:02:27 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Closed udev Kernel Socket.
Sep 30 06:02:27 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Stopped dracut pre-udev hook.
Sep 30 06:02:27 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Stopped dracut cmdline hook.
Sep 30 06:02:27 localhost systemd[1]: Starting Cleanup udev Database...
Sep 30 06:02:27 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Sep 30 06:02:27 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Sep 30 06:02:27 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Stopped Create System Users.
Sep 30 06:02:27 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 30 06:02:27 localhost systemd[1]: Finished Cleanup udev Database.
Sep 30 06:02:27 localhost systemd[1]: Reached target Switch Root.
Sep 30 06:02:27 localhost systemd[1]: Starting Switch Root...
Sep 30 06:02:27 localhost systemd[1]: Switching root.
Sep 30 06:02:27 localhost systemd-journald[309]: Journal stopped
Sep 30 06:02:28 localhost systemd-journald[309]: Received SIGTERM from PID 1 (systemd).
Sep 30 06:02:28 localhost kernel: audit: type=1404 audit(1759212147.505:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Sep 30 06:02:28 localhost kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 06:02:28 localhost kernel: SELinux:  policy capability open_perms=1
Sep 30 06:02:28 localhost kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 06:02:28 localhost kernel: SELinux:  policy capability always_check_network=0
Sep 30 06:02:28 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 06:02:28 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 06:02:28 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 06:02:28 localhost kernel: audit: type=1403 audit(1759212147.659:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 30 06:02:28 localhost systemd[1]: Successfully loaded SELinux policy in 159.463ms.
Sep 30 06:02:28 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 33.985ms.
Sep 30 06:02:28 localhost systemd[1]: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Sep 30 06:02:28 localhost systemd[1]: Detected virtualization kvm.
Sep 30 06:02:28 localhost systemd[1]: Detected architecture x86-64.
Sep 30 06:02:28 localhost systemd-rc-local-generator[637]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:02:28 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 30 06:02:28 localhost systemd[1]: Stopped Switch Root.
Sep 30 06:02:28 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 30 06:02:28 localhost systemd[1]: Created slice Slice /system/getty.
Sep 30 06:02:28 localhost systemd[1]: Created slice Slice /system/serial-getty.
Sep 30 06:02:28 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Sep 30 06:02:28 localhost systemd[1]: Created slice User and Session Slice.
Sep 30 06:02:28 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Sep 30 06:02:28 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Sep 30 06:02:28 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Sep 30 06:02:28 localhost systemd[1]: Reached target Local Encrypted Volumes.
Sep 30 06:02:28 localhost systemd[1]: Stopped target Switch Root.
Sep 30 06:02:28 localhost systemd[1]: Stopped target Initrd File Systems.
Sep 30 06:02:28 localhost systemd[1]: Stopped target Initrd Root File System.
Sep 30 06:02:28 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Sep 30 06:02:28 localhost systemd[1]: Reached target Path Units.
Sep 30 06:02:28 localhost systemd[1]: Reached target rpc_pipefs.target.
Sep 30 06:02:28 localhost systemd[1]: Reached target Slice Units.
Sep 30 06:02:28 localhost systemd[1]: Reached target Swaps.
Sep 30 06:02:28 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Sep 30 06:02:28 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Sep 30 06:02:28 localhost systemd[1]: Reached target RPC Port Mapper.
Sep 30 06:02:28 localhost systemd[1]: Listening on Process Core Dump Socket.
Sep 30 06:02:28 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Sep 30 06:02:28 localhost systemd[1]: Listening on udev Control Socket.
Sep 30 06:02:28 localhost systemd[1]: Listening on udev Kernel Socket.
Sep 30 06:02:28 localhost systemd[1]: Mounting Huge Pages File System...
Sep 30 06:02:28 localhost systemd[1]: Mounting POSIX Message Queue File System...
Sep 30 06:02:28 localhost systemd[1]: Mounting Kernel Debug File System...
Sep 30 06:02:28 localhost systemd[1]: Mounting Kernel Trace File System...
Sep 30 06:02:28 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Sep 30 06:02:28 localhost systemd[1]: Starting Create List of Static Device Nodes...
Sep 30 06:02:28 localhost systemd[1]: Starting Load Kernel Module configfs...
Sep 30 06:02:28 localhost systemd[1]: Starting Load Kernel Module drm...
Sep 30 06:02:28 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Sep 30 06:02:28 localhost systemd[1]: Starting Load Kernel Module fuse...
Sep 30 06:02:28 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Sep 30 06:02:28 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 30 06:02:28 localhost systemd[1]: Stopped File System Check on Root Device.
Sep 30 06:02:28 localhost systemd[1]: Stopped Journal Service.
Sep 30 06:02:28 localhost systemd[1]: Starting Journal Service...
Sep 30 06:02:28 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Sep 30 06:02:28 localhost systemd[1]: Starting Generate network units from Kernel command line...
Sep 30 06:02:28 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Sep 30 06:02:28 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Sep 30 06:02:28 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 30 06:02:28 localhost systemd[1]: Starting Apply Kernel Variables...
Sep 30 06:02:28 localhost kernel: fuse: init (API version 7.37)
Sep 30 06:02:28 localhost systemd[1]: Starting Coldplug All udev Devices...
Sep 30 06:02:28 localhost systemd[1]: Mounted Huge Pages File System.
Sep 30 06:02:28 localhost systemd[1]: Mounted POSIX Message Queue File System.
Sep 30 06:02:28 localhost systemd[1]: Mounted Kernel Debug File System.
Sep 30 06:02:28 localhost systemd[1]: Mounted Kernel Trace File System.
Sep 30 06:02:28 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Sep 30 06:02:28 localhost systemd-journald[678]: Journal started
Sep 30 06:02:28 localhost systemd-journald[678]: Runtime Journal (/run/log/journal/21983c68f36a73745cc172a394ebc51d) is 8.0M, max 153.5M, 145.5M free.
Sep 30 06:02:28 localhost systemd[1]: Queued start job for default target Multi-User System.
Sep 30 06:02:28 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 30 06:02:28 localhost systemd[1]: Started Journal Service.
Sep 30 06:02:28 localhost systemd[1]: Finished Create List of Static Device Nodes.
Sep 30 06:02:28 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 30 06:02:28 localhost systemd[1]: Finished Load Kernel Module configfs.
Sep 30 06:02:28 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 30 06:02:28 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Sep 30 06:02:28 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 30 06:02:28 localhost systemd[1]: Finished Load Kernel Module fuse.
Sep 30 06:02:28 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Sep 30 06:02:28 localhost systemd[1]: Finished Generate network units from Kernel command line.
Sep 30 06:02:28 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Sep 30 06:02:28 localhost systemd[1]: Finished Apply Kernel Variables.
Sep 30 06:02:28 localhost kernel: ACPI: bus type drm_connector registered
Sep 30 06:02:28 localhost systemd[1]: Mounting FUSE Control File System...
Sep 30 06:02:28 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Sep 30 06:02:28 localhost systemd[1]: Starting Rebuild Hardware Database...
Sep 30 06:02:28 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Sep 30 06:02:28 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 30 06:02:28 localhost systemd[1]: Starting Load/Save OS Random Seed...
Sep 30 06:02:28 localhost systemd[1]: Starting Create System Users...
Sep 30 06:02:28 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 30 06:02:28 localhost systemd[1]: Finished Load Kernel Module drm.
Sep 30 06:02:28 localhost systemd[1]: Mounted FUSE Control File System.
Sep 30 06:02:28 localhost systemd-journald[678]: Runtime Journal (/run/log/journal/21983c68f36a73745cc172a394ebc51d) is 8.0M, max 153.5M, 145.5M free.
Sep 30 06:02:28 localhost systemd-journald[678]: Received client request to flush runtime journal.
Sep 30 06:02:28 localhost systemd[1]: Finished Load/Save OS Random Seed.
Sep 30 06:02:28 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Sep 30 06:02:28 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Sep 30 06:02:28 localhost systemd[1]: Finished Create System Users.
Sep 30 06:02:28 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Sep 30 06:02:28 localhost systemd[1]: Finished Coldplug All udev Devices.
Sep 30 06:02:28 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Sep 30 06:02:28 localhost systemd[1]: Reached target Preparation for Local File Systems.
Sep 30 06:02:28 localhost systemd[1]: Reached target Local File Systems.
Sep 30 06:02:28 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Sep 30 06:02:28 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Sep 30 06:02:28 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 30 06:02:28 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Sep 30 06:02:28 localhost systemd[1]: Starting Automatic Boot Loader Update...
Sep 30 06:02:28 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Sep 30 06:02:28 localhost systemd[1]: Starting Create Volatile Files and Directories...
Sep 30 06:02:28 localhost bootctl[697]: Couldn't find EFI system partition, skipping.
Sep 30 06:02:28 localhost systemd[1]: Finished Automatic Boot Loader Update.
Sep 30 06:02:28 localhost systemd[1]: Finished Create Volatile Files and Directories.
Sep 30 06:02:28 localhost systemd[1]: Starting Security Auditing Service...
Sep 30 06:02:28 localhost systemd[1]: Starting RPC Bind...
Sep 30 06:02:28 localhost systemd[1]: Starting Rebuild Journal Catalog...
Sep 30 06:02:28 localhost auditd[703]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Sep 30 06:02:28 localhost auditd[703]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Sep 30 06:02:28 localhost systemd[1]: Finished Rebuild Journal Catalog.
Sep 30 06:02:28 localhost systemd[1]: Started RPC Bind.
Sep 30 06:02:28 localhost augenrules[708]: /sbin/augenrules: No change
Sep 30 06:02:28 localhost augenrules[723]: No rules
Sep 30 06:02:28 localhost augenrules[723]: enabled 1
Sep 30 06:02:28 localhost augenrules[723]: failure 1
Sep 30 06:02:28 localhost augenrules[723]: pid 703
Sep 30 06:02:28 localhost augenrules[723]: rate_limit 0
Sep 30 06:02:28 localhost augenrules[723]: backlog_limit 8192
Sep 30 06:02:28 localhost augenrules[723]: lost 0
Sep 30 06:02:28 localhost augenrules[723]: backlog 0
Sep 30 06:02:28 localhost augenrules[723]: backlog_wait_time 60000
Sep 30 06:02:28 localhost augenrules[723]: backlog_wait_time_actual 0
Sep 30 06:02:28 localhost systemd[1]: Started Security Auditing Service.
Sep 30 06:02:28 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Sep 30 06:02:28 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Sep 30 06:02:28 localhost systemd[1]: Finished Rebuild Hardware Database.
Sep 30 06:02:28 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Sep 30 06:02:29 localhost systemd-udevd[732]: Using default interface naming scheme 'rhel-9.0'.
Sep 30 06:02:29 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Sep 30 06:02:29 localhost systemd[1]: Starting Load Kernel Module configfs...
Sep 30 06:02:29 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Sep 30 06:02:29 localhost systemd[1]: Starting Update is Completed...
Sep 30 06:02:29 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 30 06:02:29 localhost systemd[1]: Finished Load Kernel Module configfs.
Sep 30 06:02:29 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Sep 30 06:02:29 localhost systemd[1]: Finished Update is Completed.
Sep 30 06:02:29 localhost systemd-udevd[749]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 06:02:29 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Sep 30 06:02:29 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Sep 30 06:02:29 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Sep 30 06:02:29 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Sep 30 06:02:29 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Sep 30 06:02:29 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Sep 30 06:02:29 localhost kernel: Console: switching to colour dummy device 80x25
Sep 30 06:02:29 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Sep 30 06:02:29 localhost kernel: [drm] features: -context_init
Sep 30 06:02:29 localhost kernel: [drm] number of scanouts: 1
Sep 30 06:02:29 localhost kernel: [drm] number of cap sets: 0
Sep 30 06:02:29 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Sep 30 06:02:29 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Sep 30 06:02:29 localhost kernel: kvm_amd: TSC scaling supported
Sep 30 06:02:29 localhost kernel: kvm_amd: Nested Virtualization enabled
Sep 30 06:02:29 localhost kernel: kvm_amd: Nested Paging enabled
Sep 30 06:02:29 localhost kernel: kvm_amd: LBR virtualization supported
Sep 30 06:02:29 localhost kernel: Console: switching to colour frame buffer device 128x48
Sep 30 06:02:29 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Sep 30 06:02:29 localhost systemd[1]: Reached target System Initialization.
Sep 30 06:02:29 localhost systemd[1]: Started dnf makecache --timer.
Sep 30 06:02:29 localhost systemd[1]: Started Daily rotation of log files.
Sep 30 06:02:29 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Sep 30 06:02:29 localhost systemd[1]: Reached target Timer Units.
Sep 30 06:02:29 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Sep 30 06:02:29 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Sep 30 06:02:29 localhost systemd[1]: Reached target Socket Units.
Sep 30 06:02:29 localhost systemd[1]: Starting D-Bus System Message Bus...
Sep 30 06:02:29 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Sep 30 06:02:29 localhost systemd[1]: Started D-Bus System Message Bus.
Sep 30 06:02:29 localhost systemd[1]: Reached target Basic System.
Sep 30 06:02:29 localhost dbus-broker-lau[807]: Ready
Sep 30 06:02:29 localhost systemd[1]: Starting NTP client/server...
Sep 30 06:02:29 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Sep 30 06:02:29 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Sep 30 06:02:29 localhost systemd[1]: Starting IPv4 firewall with iptables...
Sep 30 06:02:29 localhost systemd[1]: Started irqbalance daemon.
Sep 30 06:02:29 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Sep 30 06:02:29 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 06:02:29 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 06:02:29 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 06:02:29 localhost systemd[1]: Reached target sshd-keygen.target.
Sep 30 06:02:29 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Sep 30 06:02:29 localhost systemd[1]: Reached target User and Group Name Lookups.
Sep 30 06:02:29 localhost systemd[1]: Starting User Login Management...
Sep 30 06:02:29 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Sep 30 06:02:29 localhost systemd-logind[824]: New seat seat0.
Sep 30 06:02:29 localhost systemd-logind[824]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 30 06:02:29 localhost systemd-logind[824]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Sep 30 06:02:29 localhost systemd[1]: Started User Login Management.
Sep 30 06:02:29 localhost chronyd[831]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Sep 30 06:02:29 localhost chronyd[831]: Loaded 0 symmetric keys
Sep 30 06:02:29 localhost chronyd[831]: Using right/UTC timezone to obtain leap second data
Sep 30 06:02:29 localhost chronyd[831]: Loaded seccomp filter (level 2)
Sep 30 06:02:29 localhost systemd[1]: Started NTP client/server.
Sep 30 06:02:29 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Sep 30 06:02:29 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Sep 30 06:02:29 localhost iptables.init[818]: iptables: Applying firewall rules: [  OK  ]
Sep 30 06:02:29 localhost systemd[1]: Finished IPv4 firewall with iptables.
Sep 30 06:02:30 localhost cloud-init[841]: Cloud-init v. 24.4-7.el9 running 'init-local' at Tue, 30 Sep 2025 06:02:30 +0000. Up 7.90 seconds.
Sep 30 06:02:30 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Sep 30 06:02:30 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Sep 30 06:02:30 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpify1rxnw.mount: Deactivated successfully.
Sep 30 06:02:30 localhost systemd[1]: Starting Hostname Service...
Sep 30 06:02:30 localhost systemd[1]: Started Hostname Service.
Sep 30 06:02:30 np0005461738.novalocal systemd-hostnamed[855]: Hostname set to <np0005461738.novalocal> (static)
Sep 30 06:02:30 np0005461738.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Sep 30 06:02:30 np0005461738.novalocal systemd[1]: Reached target Preparation for Network.
Sep 30 06:02:30 np0005461738.novalocal systemd[1]: Starting Network Manager...
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.7922] NetworkManager (version 1.54.1-1.el9) is starting... (boot:5ec163b0-1932-4293-bd17-8c478fff576e)
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.7927] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8144] manager[0x563b07448080]: monitoring kernel firmware directory '/lib/firmware'.
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8197] hostname: hostname: using hostnamed
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8197] hostname: static hostname changed from (none) to "np0005461738.novalocal"
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8204] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8383] manager[0x563b07448080]: rfkill: Wi-Fi hardware radio set enabled
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8384] manager[0x563b07448080]: rfkill: WWAN hardware radio set enabled
Sep 30 06:02:30 np0005461738.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8484] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8485] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8485] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8486] manager: Networking is enabled by state file
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8488] settings: Loaded settings plugin: keyfile (internal)
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8533] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8557] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8586] dhcp: init: Using DHCP client 'internal'
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8590] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8603] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8614] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8621] device (lo): Activation: starting connection 'lo' (dd23f76c-752a-4e70-b19b-d6c1272b025e)
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8630] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8633] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8663] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8668] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8671] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8674] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8676] device (eth0): carrier: link connected
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8680] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8686] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8692] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8697] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8698] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8699] manager: NetworkManager state is now CONNECTING
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8701] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8708] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.8710] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 06:02:30 np0005461738.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 06:02:30 np0005461738.novalocal systemd[1]: Started Network Manager.
Sep 30 06:02:30 np0005461738.novalocal systemd[1]: Reached target Network.
Sep 30 06:02:30 np0005461738.novalocal systemd[1]: Starting Network Manager Wait Online...
Sep 30 06:02:30 np0005461738.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Sep 30 06:02:30 np0005461738.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.9050] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.9053] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 06:02:30 np0005461738.novalocal NetworkManager[859]: <info>  [1759212150.9061] device (lo): Activation: successful, device activated.
Sep 30 06:02:30 np0005461738.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Sep 30 06:02:30 np0005461738.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Sep 30 06:02:30 np0005461738.novalocal systemd[1]: Reached target NFS client services.
Sep 30 06:02:30 np0005461738.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Sep 30 06:02:30 np0005461738.novalocal systemd[1]: Reached target Remote File Systems.
Sep 30 06:02:30 np0005461738.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Sep 30 06:02:33 np0005461738.novalocal NetworkManager[859]: <info>  [1759212153.6028] dhcp4 (eth0): state changed new lease, address=38.102.83.22
Sep 30 06:02:33 np0005461738.novalocal NetworkManager[859]: <info>  [1759212153.6046] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Sep 30 06:02:33 np0005461738.novalocal NetworkManager[859]: <info>  [1759212153.6081] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 06:02:33 np0005461738.novalocal NetworkManager[859]: <info>  [1759212153.6117] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 06:02:33 np0005461738.novalocal NetworkManager[859]: <info>  [1759212153.6118] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 06:02:33 np0005461738.novalocal NetworkManager[859]: <info>  [1759212153.6122] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 06:02:33 np0005461738.novalocal NetworkManager[859]: <info>  [1759212153.6126] device (eth0): Activation: successful, device activated.
Sep 30 06:02:33 np0005461738.novalocal NetworkManager[859]: <info>  [1759212153.6132] manager: NetworkManager state is now CONNECTED_GLOBAL
Sep 30 06:02:33 np0005461738.novalocal NetworkManager[859]: <info>  [1759212153.6135] manager: startup complete
Sep 30 06:02:33 np0005461738.novalocal systemd[1]: Finished Network Manager Wait Online.
Sep 30 06:02:33 np0005461738.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: Cloud-init v. 24.4-7.el9 running 'init' at Tue, 30 Sep 2025 06:02:33 +0000. Up 11.70 seconds.
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: |  eth0  | True |         38.102.83.22         | 255.255.255.0 | global | fa:16:3e:58:88:1d |
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: |  eth0  | True | fe80::f816:3eff:fe58:881d/64 |       .       |  link  | fa:16:3e:58:88:1d |
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Sep 30 06:02:33 np0005461738.novalocal cloud-init[923]: ci-info: |   3   |    local    |    ::   |    eth0   |   U   |
Sep 30 06:02:34 np0005461738.novalocal cloud-init[923]: ci-info: |   4   |  multicast  |    ::   |    eth0   |   U   |
Sep 30 06:02:34 np0005461738.novalocal cloud-init[923]: ci-info: +-------+-------------+---------+-----------+-------+
Sep 30 06:02:34 np0005461738.novalocal useradd[990]: new group: name=cloud-user, GID=1001
Sep 30 06:02:34 np0005461738.novalocal useradd[990]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Sep 30 06:02:34 np0005461738.novalocal useradd[990]: add 'cloud-user' to group 'adm'
Sep 30 06:02:34 np0005461738.novalocal useradd[990]: add 'cloud-user' to group 'systemd-journal'
Sep 30 06:02:34 np0005461738.novalocal useradd[990]: add 'cloud-user' to shadow group 'adm'
Sep 30 06:02:34 np0005461738.novalocal useradd[990]: add 'cloud-user' to shadow group 'systemd-journal'
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: Generating public/private rsa key pair.
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: The key fingerprint is:
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: SHA256:z975eOtEVtjKk2FqFsgRhD0ywe4l/IygGgZ31svSymA root@np0005461738.novalocal
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: The key's randomart image is:
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: +---[RSA 3072]----+
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |       ..=+.     |
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |        =.oo   o |
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |     . o oo.. + o|
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |. . o o + .  = = |
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: | o o + +S*  + B  |
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |  E o + oooo o . |
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: | o = o    o   .  |
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |  . o    . . +.  |
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |          . +++. |
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: +----[SHA256]-----+
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: Generating public/private ecdsa key pair.
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: The key fingerprint is:
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: SHA256:aB903MWNUqJdNjDgPjpaskpJOeTM/rQrBEvyAp7hMl4 root@np0005461738.novalocal
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: The key's randomart image is:
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: +---[ECDSA 256]---+
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |          ..++*o |
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |         o +.*o..|
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |    .   . = o.   |
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |o.o= . o o       |
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |++ooB o S o      |
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |++oE.+ . o .     |
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |oo..+ o =        |
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: | . ..o * .       |
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |    .o*.         |
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: +----[SHA256]-----+
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: Generating public/private ed25519 key pair.
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: The key fingerprint is:
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: SHA256:N2R++/dW9VxEXqiLKAFa50w4hKxwKLL7vwTjwvByym0 root@np0005461738.novalocal
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: The key's randomart image is:
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: +--[ED25519 256]--+
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: | o o..         oo|
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |= + = o       ..o|
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |++ o B    o  . ..|
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |o .   +  +  .   o|
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |..o    .S.+... .+|
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |+o o  . ...o..  +|
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |oo+ .  .    .   .|
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |.=oE         .  o|
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: |...oo.        .oo|
Sep 30 06:02:35 np0005461738.novalocal cloud-init[923]: +----[SHA256]-----+
Sep 30 06:02:35 np0005461738.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Sep 30 06:02:35 np0005461738.novalocal sm-notify[1006]: Version 2.5.4 starting
Sep 30 06:02:35 np0005461738.novalocal systemd[1]: Reached target Cloud-config availability.
Sep 30 06:02:35 np0005461738.novalocal sshd[1008]: Server listening on 0.0.0.0 port 22.
Sep 30 06:02:35 np0005461738.novalocal systemd[1]: Reached target Network is Online.
Sep 30 06:02:35 np0005461738.novalocal sshd[1008]: Server listening on :: port 22.
Sep 30 06:02:35 np0005461738.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Sep 30 06:02:35 np0005461738.novalocal crond[1010]: (CRON) STARTUP (1.5.7)
Sep 30 06:02:35 np0005461738.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Sep 30 06:02:35 np0005461738.novalocal crond[1010]: (CRON) INFO (Syslog will be used instead of sendmail.)
Sep 30 06:02:35 np0005461738.novalocal systemd[1]: Starting System Logging Service...
Sep 30 06:02:35 np0005461738.novalocal crond[1010]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 4% if used.)
Sep 30 06:02:35 np0005461738.novalocal systemd[1]: Starting OpenSSH server daemon...
Sep 30 06:02:35 np0005461738.novalocal crond[1010]: (CRON) INFO (running with inotify support)
Sep 30 06:02:35 np0005461738.novalocal systemd[1]: Starting Permit User Sessions...
Sep 30 06:02:35 np0005461738.novalocal systemd[1]: Started Notify NFS peers of a restart.
Sep 30 06:02:35 np0005461738.novalocal systemd[1]: Started OpenSSH server daemon.
Sep 30 06:02:35 np0005461738.novalocal systemd[1]: Finished Permit User Sessions.
Sep 30 06:02:35 np0005461738.novalocal systemd[1]: Started Command Scheduler.
Sep 30 06:02:35 np0005461738.novalocal systemd[1]: Started Getty on tty1.
Sep 30 06:02:35 np0005461738.novalocal systemd[1]: Started Serial Getty on ttyS0.
Sep 30 06:02:35 np0005461738.novalocal systemd[1]: Reached target Login Prompts.
Sep 30 06:02:36 np0005461738.novalocal rsyslogd[1007]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1007" x-info="https://www.rsyslog.com"] start
Sep 30 06:02:36 np0005461738.novalocal rsyslogd[1007]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Sep 30 06:02:36 np0005461738.novalocal systemd[1]: Started System Logging Service.
Sep 30 06:02:36 np0005461738.novalocal systemd[1]: Reached target Multi-User System.
Sep 30 06:02:36 np0005461738.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Sep 30 06:02:36 np0005461738.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Sep 30 06:02:36 np0005461738.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Sep 30 06:02:36 np0005461738.novalocal rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 06:02:36 np0005461738.novalocal cloud-init[1019]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Tue, 30 Sep 2025 06:02:36 +0000. Up 14.01 seconds.
Sep 30 06:02:36 np0005461738.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Sep 30 06:02:36 np0005461738.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Sep 30 06:02:36 np0005461738.novalocal cloud-init[1024]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Tue, 30 Sep 2025 06:02:36 +0000. Up 14.43 seconds.
Sep 30 06:02:36 np0005461738.novalocal cloud-init[1027]: #############################################################
Sep 30 06:02:36 np0005461738.novalocal cloud-init[1028]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Sep 30 06:02:36 np0005461738.novalocal cloud-init[1031]: 256 SHA256:aB903MWNUqJdNjDgPjpaskpJOeTM/rQrBEvyAp7hMl4 root@np0005461738.novalocal (ECDSA)
Sep 30 06:02:36 np0005461738.novalocal cloud-init[1034]: 256 SHA256:N2R++/dW9VxEXqiLKAFa50w4hKxwKLL7vwTjwvByym0 root@np0005461738.novalocal (ED25519)
Sep 30 06:02:36 np0005461738.novalocal cloud-init[1036]: 3072 SHA256:z975eOtEVtjKk2FqFsgRhD0ywe4l/IygGgZ31svSymA root@np0005461738.novalocal (RSA)
Sep 30 06:02:36 np0005461738.novalocal sshd-session[1033]: Unable to negotiate with 38.102.83.114 port 50934: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Sep 30 06:02:36 np0005461738.novalocal cloud-init[1038]: -----END SSH HOST KEY FINGERPRINTS-----
Sep 30 06:02:36 np0005461738.novalocal cloud-init[1040]: #############################################################
Sep 30 06:02:36 np0005461738.novalocal cloud-init[1024]: Cloud-init v. 24.4-7.el9 finished at Tue, 30 Sep 2025 06:02:36 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 14.66 seconds
Sep 30 06:02:36 np0005461738.novalocal sshd-session[1025]: Connection closed by 38.102.83.114 port 50926 [preauth]
Sep 30 06:02:36 np0005461738.novalocal sshd-session[1045]: Unable to negotiate with 38.102.83.114 port 50944: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Sep 30 06:02:36 np0005461738.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Sep 30 06:02:36 np0005461738.novalocal systemd[1]: Reached target Cloud-init target.
Sep 30 06:02:36 np0005461738.novalocal sshd-session[1047]: Unable to negotiate with 38.102.83.114 port 50946: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Sep 30 06:02:36 np0005461738.novalocal systemd[1]: Startup finished in 1.761s (kernel) + 3.511s (initrd) + 9.448s (userspace) = 14.721s.
Sep 30 06:02:36 np0005461738.novalocal sshd-session[1049]: Connection reset by 38.102.83.114 port 50950 [preauth]
Sep 30 06:02:36 np0005461738.novalocal sshd-session[1039]: Connection closed by 38.102.83.114 port 50942 [preauth]
Sep 30 06:02:37 np0005461738.novalocal sshd-session[1053]: Unable to negotiate with 38.102.83.114 port 50974: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Sep 30 06:02:37 np0005461738.novalocal sshd-session[1055]: Unable to negotiate with 38.102.83.114 port 50990: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Sep 30 06:02:37 np0005461738.novalocal sshd-session[1051]: Connection closed by 38.102.83.114 port 50958 [preauth]
Sep 30 06:02:38 np0005461738.novalocal chronyd[831]: Selected source 206.108.0.133 (2.centos.pool.ntp.org)
Sep 30 06:02:38 np0005461738.novalocal chronyd[831]: System clock TAI offset set to 37 seconds
Sep 30 06:02:40 np0005461738.novalocal irqbalance[819]: Cannot change IRQ 35 affinity: Operation not permitted
Sep 30 06:02:40 np0005461738.novalocal irqbalance[819]: IRQ 35 affinity is now unmanaged
Sep 30 06:02:40 np0005461738.novalocal irqbalance[819]: Cannot change IRQ 33 affinity: Operation not permitted
Sep 30 06:02:40 np0005461738.novalocal irqbalance[819]: IRQ 33 affinity is now unmanaged
Sep 30 06:02:40 np0005461738.novalocal irqbalance[819]: Cannot change IRQ 31 affinity: Operation not permitted
Sep 30 06:02:40 np0005461738.novalocal irqbalance[819]: IRQ 31 affinity is now unmanaged
Sep 30 06:02:40 np0005461738.novalocal irqbalance[819]: Cannot change IRQ 28 affinity: Operation not permitted
Sep 30 06:02:40 np0005461738.novalocal irqbalance[819]: IRQ 28 affinity is now unmanaged
Sep 30 06:02:40 np0005461738.novalocal irqbalance[819]: Cannot change IRQ 34 affinity: Operation not permitted
Sep 30 06:02:40 np0005461738.novalocal irqbalance[819]: IRQ 34 affinity is now unmanaged
Sep 30 06:02:40 np0005461738.novalocal irqbalance[819]: Cannot change IRQ 32 affinity: Operation not permitted
Sep 30 06:02:40 np0005461738.novalocal irqbalance[819]: IRQ 32 affinity is now unmanaged
Sep 30 06:02:40 np0005461738.novalocal irqbalance[819]: Cannot change IRQ 30 affinity: Operation not permitted
Sep 30 06:02:40 np0005461738.novalocal irqbalance[819]: IRQ 30 affinity is now unmanaged
Sep 30 06:02:40 np0005461738.novalocal irqbalance[819]: Cannot change IRQ 29 affinity: Operation not permitted
Sep 30 06:02:40 np0005461738.novalocal irqbalance[819]: IRQ 29 affinity is now unmanaged
Sep 30 06:02:43 np0005461738.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 06:03:00 np0005461738.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 06:03:34 np0005461738.novalocal sshd-session[1061]: Received disconnect from 193.46.255.217 port 13122:11:  [preauth]
Sep 30 06:03:34 np0005461738.novalocal sshd-session[1061]: Disconnected from authenticating user root 193.46.255.217 port 13122 [preauth]
Sep 30 06:04:56 np0005461738.novalocal sshd[1008]: Timeout before authentication for connection from 101.126.138.155 to 38.102.83.22, pid = 1057
Sep 30 06:05:51 np0005461738.novalocal sshd-session[1064]: Invalid user hacluster from 194.0.234.19 port 47920
Sep 30 06:05:52 np0005461738.novalocal sshd-session[1064]: Connection closed by invalid user hacluster 194.0.234.19 port 47920 [preauth]
Sep 30 06:06:26 np0005461738.novalocal sshd[1008]: Timeout before authentication for connection from 120.48.170.78 to 38.102.83.22, pid = 1063
Sep 30 06:08:43 np0005461738.novalocal sshd[1008]: Timeout before authentication for connection from 14.103.107.50 to 38.102.83.22, pid = 1066
Sep 30 06:09:10 np0005461738.novalocal sshd-session[1071]: Received disconnect from 141.98.11.34 port 21874:11:  [preauth]
Sep 30 06:09:10 np0005461738.novalocal sshd-session[1071]: Disconnected from authenticating user root 141.98.11.34 port 21874 [preauth]
Sep 30 06:11:26 np0005461738.novalocal sshd-session[1073]: Invalid user Administrator from 185.156.73.233 port 44770
Sep 30 06:11:27 np0005461738.novalocal sshd-session[1073]: Connection closed by invalid user Administrator 185.156.73.233 port 44770 [preauth]
Sep 30 06:13:37 np0005461738.novalocal systemd[1]: Starting dnf makecache...
Sep 30 06:13:37 np0005461738.novalocal dnf[1078]: Failed determining last makecache time.
Sep 30 06:13:37 np0005461738.novalocal dnf[1078]: CentOS Stream 9 - BaseOS                         59 kB/s | 7.0 kB     00:00
Sep 30 06:13:38 np0005461738.novalocal dnf[1078]: CentOS Stream 9 - AppStream                      30 kB/s | 7.1 kB     00:00
Sep 30 06:13:39 np0005461738.novalocal dnf[1078]: CentOS Stream 9 - CRB                            28 kB/s | 6.9 kB     00:00
Sep 30 06:13:40 np0005461738.novalocal dnf[1078]: CentOS Stream 9 - Extras packages                13 kB/s | 8.0 kB     00:00
Sep 30 06:13:40 np0005461738.novalocal dnf[1078]: Metadata cache created.
Sep 30 06:13:40 np0005461738.novalocal systemd[1]: dnf-makecache.service: Deactivated successfully.
Sep 30 06:13:40 np0005461738.novalocal systemd[1]: Finished dnf makecache.
Sep 30 06:14:05 np0005461738.novalocal sshd-session[1087]: Invalid user bitwarden from 152.32.253.152 port 46710
Sep 30 06:14:06 np0005461738.novalocal sshd-session[1087]: Received disconnect from 152.32.253.152 port 46710:11: Bye Bye [preauth]
Sep 30 06:14:06 np0005461738.novalocal sshd-session[1087]: Disconnected from invalid user bitwarden 152.32.253.152 port 46710 [preauth]
Sep 30 06:14:09 np0005461738.novalocal sshd[1008]: Timeout before authentication for connection from 120.48.170.78 to 38.102.83.22, pid = 1075
Sep 30 06:14:57 np0005461738.novalocal sshd-session[1090]: Received disconnect from 80.94.93.233 port 24468:11:  [preauth]
Sep 30 06:14:57 np0005461738.novalocal sshd-session[1090]: Disconnected from authenticating user root 80.94.93.233 port 24468 [preauth]
Sep 30 06:15:03 np0005461738.novalocal sshd-session[1092]: Accepted publickey for zuul from 38.102.83.114 port 58878 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Sep 30 06:15:03 np0005461738.novalocal systemd[1]: Created slice User Slice of UID 1000.
Sep 30 06:15:03 np0005461738.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Sep 30 06:15:03 np0005461738.novalocal systemd-logind[824]: New session 1 of user zuul.
Sep 30 06:15:03 np0005461738.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Sep 30 06:15:03 np0005461738.novalocal systemd[1]: Starting User Manager for UID 1000...
Sep 30 06:15:03 np0005461738.novalocal systemd[1096]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:15:03 np0005461738.novalocal systemd[1096]: Queued start job for default target Main User Target.
Sep 30 06:15:03 np0005461738.novalocal systemd[1096]: Created slice User Application Slice.
Sep 30 06:15:03 np0005461738.novalocal systemd[1096]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 06:15:03 np0005461738.novalocal systemd[1096]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 06:15:03 np0005461738.novalocal systemd[1096]: Reached target Paths.
Sep 30 06:15:03 np0005461738.novalocal systemd[1096]: Reached target Timers.
Sep 30 06:15:03 np0005461738.novalocal systemd[1096]: Starting D-Bus User Message Bus Socket...
Sep 30 06:15:03 np0005461738.novalocal systemd[1096]: Starting Create User's Volatile Files and Directories...
Sep 30 06:15:03 np0005461738.novalocal systemd[1096]: Finished Create User's Volatile Files and Directories.
Sep 30 06:15:03 np0005461738.novalocal systemd[1096]: Listening on D-Bus User Message Bus Socket.
Sep 30 06:15:03 np0005461738.novalocal systemd[1096]: Reached target Sockets.
Sep 30 06:15:03 np0005461738.novalocal systemd[1096]: Reached target Basic System.
Sep 30 06:15:03 np0005461738.novalocal systemd[1096]: Reached target Main User Target.
Sep 30 06:15:03 np0005461738.novalocal systemd[1096]: Startup finished in 158ms.
Sep 30 06:15:04 np0005461738.novalocal systemd[1]: Started User Manager for UID 1000.
Sep 30 06:15:04 np0005461738.novalocal systemd[1]: Started Session 1 of User zuul.
Sep 30 06:15:04 np0005461738.novalocal sshd-session[1092]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:15:04 np0005461738.novalocal python3[1180]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:15:07 np0005461738.novalocal python3[1208]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:15:14 np0005461738.novalocal python3[1266]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:15:15 np0005461738.novalocal python3[1306]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Sep 30 06:15:17 np0005461738.novalocal python3[1332]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDlA/fNsn84avkJNyBpqpQy/mWVaQ2SfpZwb4URl6BlSd6KWPXaV/gsLzVs1opiWZVYh9lmXLFoj9Ev9cns1M7enYcG3Bl8PHtHAM6xb4vxT6a1j8p8T//vWzbazVU2TePOe+aKz0OXK38eupjBI8YstzEnuT6s6B2F9T+VyYlgozuxdXalQWqEMMFBnsm/6XUJkWl4iBw04KFZluX1upiQjf6h9kVKnvvFFcsYFZ+eWjV632u97jHdWmCWFSJEI8L/67l2B3deGJzShb5gvglb/8d4zD6nCxTOoHIe5wo9M/2IF3eW5rrjIdgQASBZClRBCZbsdvf1aadwkXYMxtFNOItH/m0QZX4KoejCxlURHK6BwJadpxolTvVowz3j9a37ak4RJ2723BzCLt0AwLjsUrQY841AJ6aUHVc/AoI76gVu6M/AxliQGykLB8thJD3Qo7YtcI1EoqEzxA5b/DlL/njoWsPP0E2YUJKhJBNyO+lG3z/Zuv71atxelHjeFnE= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:17 np0005461738.novalocal python3[1356]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:15:18 np0005461738.novalocal python3[1455]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:15:18 np0005461738.novalocal python3[1526]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759212918.0054839-229-268629283092390/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=7f47d7c16caa45d2974962c59889ca32_id_rsa follow=False checksum=6683f8cab12fc4d1b30abddc951a1261005caf96 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:15:19 np0005461738.novalocal python3[1649]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:15:19 np0005461738.novalocal python3[1720]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759212918.9974098-273-30452458602535/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=7f47d7c16caa45d2974962c59889ca32_id_rsa.pub follow=False checksum=f2cc4e0aa7d03132eaf21fc1701f0b83eaae6b0e backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:15:21 np0005461738.novalocal python3[1768]: ansible-ping Invoked with data=pong
Sep 30 06:15:22 np0005461738.novalocal python3[1792]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:15:23 np0005461738.novalocal python3[1850]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Sep 30 06:15:24 np0005461738.novalocal python3[1882]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:15:25 np0005461738.novalocal python3[1906]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:15:25 np0005461738.novalocal python3[1930]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:15:25 np0005461738.novalocal python3[1954]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:15:26 np0005461738.novalocal python3[1978]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:15:26 np0005461738.novalocal python3[2002]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:15:27 np0005461738.novalocal sudo[2026]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhfnksriemqjpwqnkclaueirpcqmelxn ; /usr/bin/python3'
Sep 30 06:15:27 np0005461738.novalocal sudo[2026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:15:27 np0005461738.novalocal python3[2028]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:15:27 np0005461738.novalocal sudo[2026]: pam_unix(sudo:session): session closed for user root
Sep 30 06:15:28 np0005461738.novalocal sudo[2104]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsctiyhktgkpplfieooyenejupqkbddl ; /usr/bin/python3'
Sep 30 06:15:28 np0005461738.novalocal sudo[2104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:15:28 np0005461738.novalocal python3[2106]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:15:28 np0005461738.novalocal sudo[2104]: pam_unix(sudo:session): session closed for user root
Sep 30 06:15:28 np0005461738.novalocal sudo[2177]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvhtsoeldphouborceszjmsuqsfzwgpj ; /usr/bin/python3'
Sep 30 06:15:28 np0005461738.novalocal sudo[2177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:15:29 np0005461738.novalocal python3[2179]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759212928.05508-26-181798686348133/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:15:29 np0005461738.novalocal sudo[2177]: pam_unix(sudo:session): session closed for user root
Sep 30 06:15:29 np0005461738.novalocal python3[2227]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:30 np0005461738.novalocal python3[2251]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:30 np0005461738.novalocal python3[2275]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:30 np0005461738.novalocal python3[2299]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:30 np0005461738.novalocal python3[2323]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:31 np0005461738.novalocal python3[2347]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:31 np0005461738.novalocal python3[2371]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:31 np0005461738.novalocal python3[2395]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:32 np0005461738.novalocal python3[2419]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:32 np0005461738.novalocal python3[2443]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:32 np0005461738.novalocal python3[2467]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:32 np0005461738.novalocal python3[2491]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:33 np0005461738.novalocal python3[2515]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:33 np0005461738.novalocal python3[2539]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:33 np0005461738.novalocal python3[2563]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:34 np0005461738.novalocal python3[2587]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:34 np0005461738.novalocal python3[2611]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:34 np0005461738.novalocal python3[2635]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:35 np0005461738.novalocal python3[2659]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:35 np0005461738.novalocal python3[2683]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:35 np0005461738.novalocal python3[2707]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:36 np0005461738.novalocal python3[2731]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:36 np0005461738.novalocal python3[2755]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:36 np0005461738.novalocal python3[2779]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:37 np0005461738.novalocal python3[2803]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:37 np0005461738.novalocal python3[2827]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:15:39 np0005461738.novalocal sudo[2851]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjrpibfnnlvgyixfgonijvziewrqwugk ; /usr/bin/python3'
Sep 30 06:15:39 np0005461738.novalocal sudo[2851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:15:39 np0005461738.novalocal python3[2853]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Sep 30 06:15:39 np0005461738.novalocal systemd[1]: Starting Time & Date Service...
Sep 30 06:15:39 np0005461738.novalocal systemd[1]: Started Time & Date Service.
Sep 30 06:15:39 np0005461738.novalocal systemd-timedated[2855]: Changed time zone to 'UTC' (UTC).
Sep 30 06:15:39 np0005461738.novalocal sudo[2851]: pam_unix(sudo:session): session closed for user root
Sep 30 06:15:39 np0005461738.novalocal sudo[2882]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avlldauvjlrexajoyxblywejtvhuvxfh ; /usr/bin/python3'
Sep 30 06:15:39 np0005461738.novalocal sudo[2882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:15:40 np0005461738.novalocal python3[2884]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:15:40 np0005461738.novalocal sudo[2882]: pam_unix(sudo:session): session closed for user root
Sep 30 06:15:40 np0005461738.novalocal python3[2960]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:15:40 np0005461738.novalocal python3[3031]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1759212940.2076037-202-34743975067672/source _original_basename=tmpp4a0z3ji follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:15:41 np0005461738.novalocal python3[3131]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:15:41 np0005461738.novalocal python3[3202]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759212941.1160731-242-248724432255121/source _original_basename=tmpvsox7x9g follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:15:42 np0005461738.novalocal sudo[3302]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxnmrkgfmhtrynghkmouxoajhopcnvyu ; /usr/bin/python3'
Sep 30 06:15:42 np0005461738.novalocal sudo[3302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:15:42 np0005461738.novalocal python3[3304]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:15:42 np0005461738.novalocal sudo[3302]: pam_unix(sudo:session): session closed for user root
Sep 30 06:15:42 np0005461738.novalocal sudo[3375]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flgjcubwvgsaxwdfxgqaussolnjgpzlr ; /usr/bin/python3'
Sep 30 06:15:42 np0005461738.novalocal sudo[3375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:15:42 np0005461738.novalocal python3[3377]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759212942.2610414-306-153096125061898/source _original_basename=tmpp62zc9ls follow=False checksum=0a5264336eaf669ce906803fabc64043ef3757da backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:15:42 np0005461738.novalocal sudo[3375]: pam_unix(sudo:session): session closed for user root
Sep 30 06:15:43 np0005461738.novalocal python3[3425]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:15:43 np0005461738.novalocal python3[3451]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:15:44 np0005461738.novalocal sudo[3529]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-orilgwajcglsreuanwrcpszihosxwvti ; /usr/bin/python3'
Sep 30 06:15:44 np0005461738.novalocal sudo[3529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:15:44 np0005461738.novalocal python3[3531]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:15:44 np0005461738.novalocal sudo[3529]: pam_unix(sudo:session): session closed for user root
Sep 30 06:15:44 np0005461738.novalocal sudo[3602]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gorlfdvfkqckjafmwlckurgxfwalfddy ; /usr/bin/python3'
Sep 30 06:15:44 np0005461738.novalocal sudo[3602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:15:44 np0005461738.novalocal python3[3604]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1759212943.9598875-362-208118948414252/source _original_basename=tmpgx327m2g follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:15:44 np0005461738.novalocal sudo[3602]: pam_unix(sudo:session): session closed for user root
Sep 30 06:15:45 np0005461738.novalocal sudo[3653]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-znpyfnvuljlmpsgbwkupbfatuiwvhwsl ; /usr/bin/python3'
Sep 30 06:15:45 np0005461738.novalocal sudo[3653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:15:45 np0005461738.novalocal python3[3655]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163e3b-3c83-b3fe-6b76-00000000001e-1-compute0 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:15:45 np0005461738.novalocal sudo[3653]: pam_unix(sudo:session): session closed for user root
Sep 30 06:15:45 np0005461738.novalocal python3[3683]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-b3fe-6b76-00000000001f-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Sep 30 06:15:47 np0005461738.novalocal python3[3711]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:16:01 np0005461738.novalocal sshd-session[3712]: Connection closed by 120.48.170.78 port 45498 [preauth]
Sep 30 06:16:08 np0005461738.novalocal sudo[3737]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxxybcjpgcqriohqtvodrutdwnfocjfg ; /usr/bin/python3'
Sep 30 06:16:08 np0005461738.novalocal sudo[3737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:16:09 np0005461738.novalocal python3[3739]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:16:09 np0005461738.novalocal sudo[3737]: pam_unix(sudo:session): session closed for user root
Sep 30 06:16:09 np0005461738.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Sep 30 06:16:44 np0005461738.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Sep 30 06:16:44 np0005461738.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Sep 30 06:16:44 np0005461738.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Sep 30 06:16:44 np0005461738.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Sep 30 06:16:44 np0005461738.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Sep 30 06:16:44 np0005461738.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Sep 30 06:16:44 np0005461738.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Sep 30 06:16:44 np0005461738.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Sep 30 06:16:44 np0005461738.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Sep 30 06:16:44 np0005461738.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Sep 30 06:16:44 np0005461738.novalocal NetworkManager[859]: <info>  [1759213004.6097] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Sep 30 06:16:44 np0005461738.novalocal systemd-udevd[3742]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 06:16:44 np0005461738.novalocal NetworkManager[859]: <info>  [1759213004.6436] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 06:16:44 np0005461738.novalocal NetworkManager[859]: <info>  [1759213004.6478] settings: (eth1): created default wired connection 'Wired connection 1'
Sep 30 06:16:44 np0005461738.novalocal NetworkManager[859]: <info>  [1759213004.6484] device (eth1): carrier: link connected
Sep 30 06:16:44 np0005461738.novalocal NetworkManager[859]: <info>  [1759213004.6487] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Sep 30 06:16:44 np0005461738.novalocal NetworkManager[859]: <info>  [1759213004.6496] policy: auto-activating connection 'Wired connection 1' (78674247-54a0-3097-b9a3-7d78780bff1a)
Sep 30 06:16:44 np0005461738.novalocal NetworkManager[859]: <info>  [1759213004.6501] device (eth1): Activation: starting connection 'Wired connection 1' (78674247-54a0-3097-b9a3-7d78780bff1a)
Sep 30 06:16:44 np0005461738.novalocal NetworkManager[859]: <info>  [1759213004.6503] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 06:16:44 np0005461738.novalocal NetworkManager[859]: <info>  [1759213004.6506] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 06:16:44 np0005461738.novalocal NetworkManager[859]: <info>  [1759213004.6512] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 06:16:44 np0005461738.novalocal NetworkManager[859]: <info>  [1759213004.6519] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Sep 30 06:16:45 np0005461738.novalocal python3[3769]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163e3b-3c83-4791-38ba-000000000112-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:16:52 np0005461738.novalocal sudo[3848]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekuhzugtobulvorabolfuzowrtdgrjov ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 06:16:52 np0005461738.novalocal sudo[3848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:16:52 np0005461738.novalocal python3[3850]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:16:52 np0005461738.novalocal sudo[3848]: pam_unix(sudo:session): session closed for user root
Sep 30 06:16:52 np0005461738.novalocal sudo[3921]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxcqgnphdylgrsuyefbzjmpkwflwapnf ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 06:16:52 np0005461738.novalocal sudo[3921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:16:52 np0005461738.novalocal python3[3923]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759213012.078953-103-232754301045925/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=01e617fcd44182aea761f4ac1f6afa5a9696044a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:16:52 np0005461738.novalocal sudo[3921]: pam_unix(sudo:session): session closed for user root
Sep 30 06:16:53 np0005461738.novalocal sudo[3971]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iygtmwvhjrgblemgnyililpemwggdylc ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 06:16:53 np0005461738.novalocal sudo[3971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:16:53 np0005461738.novalocal python3[3973]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 06:16:53 np0005461738.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Sep 30 06:16:53 np0005461738.novalocal systemd[1]: Stopped Network Manager Wait Online.
Sep 30 06:16:53 np0005461738.novalocal systemd[1]: Stopping Network Manager Wait Online...
Sep 30 06:16:53 np0005461738.novalocal systemd[1]: Stopping Network Manager...
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[859]: <info>  [1759213013.7876] caught SIGTERM, shutting down normally.
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[859]: <info>  [1759213013.7887] dhcp4 (eth0): canceled DHCP transaction
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[859]: <info>  [1759213013.7887] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[859]: <info>  [1759213013.7887] dhcp4 (eth0): state changed no lease
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[859]: <info>  [1759213013.7890] manager: NetworkManager state is now CONNECTING
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[859]: <info>  [1759213013.7974] dhcp4 (eth1): canceled DHCP transaction
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[859]: <info>  [1759213013.7974] dhcp4 (eth1): state changed no lease
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[859]: <info>  [1759213013.8022] exiting (success)
Sep 30 06:16:53 np0005461738.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 06:16:53 np0005461738.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Sep 30 06:16:53 np0005461738.novalocal systemd[1]: Stopped Network Manager.
Sep 30 06:16:53 np0005461738.novalocal systemd[1]: NetworkManager.service: Consumed 6.617s CPU time, 10.1M memory peak.
Sep 30 06:16:53 np0005461738.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 06:16:53 np0005461738.novalocal systemd[1]: Starting Network Manager...
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213013.8701] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:5ec163b0-1932-4293-bd17-8c478fff576e)
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213013.8705] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213013.8779] manager[0x5559d1b3f070]: monitoring kernel firmware directory '/lib/firmware'.
Sep 30 06:16:53 np0005461738.novalocal systemd[1]: Starting Hostname Service...
Sep 30 06:16:53 np0005461738.novalocal systemd[1]: Started Hostname Service.
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213013.9929] hostname: hostname: using hostnamed
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213013.9930] hostname: static hostname changed from (none) to "np0005461738.novalocal"
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213013.9939] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213013.9946] manager[0x5559d1b3f070]: rfkill: Wi-Fi hardware radio set enabled
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213013.9947] manager[0x5559d1b3f070]: rfkill: WWAN hardware radio set enabled
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213013.9995] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213013.9996] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213013.9997] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Sep 30 06:16:53 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213013.9998] manager: Networking is enabled by state file
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0001] settings: Loaded settings plugin: keyfile (internal)
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0008] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0058] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0076] dhcp: init: Using DHCP client 'internal'
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0081] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0090] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0101] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0116] device (lo): Activation: starting connection 'lo' (dd23f76c-752a-4e70-b19b-d6c1272b025e)
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0129] device (eth0): carrier: link connected
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0139] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0150] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0151] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0167] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0180] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0191] device (eth1): carrier: link connected
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0198] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0206] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (78674247-54a0-3097-b9a3-7d78780bff1a) (indicated)
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0207] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0216] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0227] device (eth1): Activation: starting connection 'Wired connection 1' (78674247-54a0-3097-b9a3-7d78780bff1a)
Sep 30 06:16:54 np0005461738.novalocal systemd[1]: Started Network Manager.
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0245] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0252] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0256] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0258] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0260] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0264] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0267] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0270] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0274] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0285] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0288] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0298] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0300] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0321] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0322] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0328] device (lo): Activation: successful, device activated.
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0338] dhcp4 (eth0): state changed new lease, address=38.102.83.22
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0345] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Sep 30 06:16:54 np0005461738.novalocal systemd[1]: Starting Network Manager Wait Online...
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0442] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0485] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0488] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0491] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0497] device (eth0): Activation: successful, device activated.
Sep 30 06:16:54 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213014.0503] manager: NetworkManager state is now CONNECTED_GLOBAL
Sep 30 06:16:54 np0005461738.novalocal sudo[3971]: pam_unix(sudo:session): session closed for user root
Sep 30 06:16:54 np0005461738.novalocal python3[4057]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163e3b-3c83-4791-38ba-0000000000b2-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:17:04 np0005461738.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 06:17:08 np0005461738.novalocal sshd-session[4060]: Invalid user test from 152.32.253.152 port 35188
Sep 30 06:17:08 np0005461738.novalocal sshd-session[4060]: Received disconnect from 152.32.253.152 port 35188:11: Bye Bye [preauth]
Sep 30 06:17:08 np0005461738.novalocal sshd-session[4060]: Disconnected from invalid user test 152.32.253.152 port 35188 [preauth]
Sep 30 06:17:22 np0005461738.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Sep 30 06:17:22 np0005461738.novalocal systemd[1096]: Starting Mark boot as successful...
Sep 30 06:17:22 np0005461738.novalocal systemd[1096]: Finished Mark boot as successful.
Sep 30 06:17:22 np0005461738.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Sep 30 06:17:22 np0005461738.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Sep 30 06:17:22 np0005461738.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Sep 30 06:17:24 np0005461738.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2177] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 06:17:39 np0005461738.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 06:17:39 np0005461738.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2552] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2556] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2578] device (eth1): Activation: successful, device activated.
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2589] manager: startup complete
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2594] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <warn>  [1759213059.2615] device (eth1): Activation: failed for connection 'Wired connection 1'
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2625] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Sep 30 06:17:39 np0005461738.novalocal systemd[1]: Finished Network Manager Wait Online.
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2774] dhcp4 (eth1): canceled DHCP transaction
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2775] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2775] dhcp4 (eth1): state changed no lease
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2795] policy: auto-activating connection 'ci-private-network' (1fa3647f-a0b3-57b1-8a07-78f0592e2b89)
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2802] device (eth1): Activation: starting connection 'ci-private-network' (1fa3647f-a0b3-57b1-8a07-78f0592e2b89)
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2803] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2807] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2817] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2829] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2881] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2884] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 06:17:39 np0005461738.novalocal NetworkManager[3981]: <info>  [1759213059.2892] device (eth1): Activation: successful, device activated.
Sep 30 06:17:49 np0005461738.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 06:17:54 np0005461738.novalocal sshd-session[1107]: Received disconnect from 38.102.83.114 port 58878:11: disconnected by user
Sep 30 06:17:54 np0005461738.novalocal sshd-session[1107]: Disconnected from user zuul 38.102.83.114 port 58878
Sep 30 06:17:54 np0005461738.novalocal sshd-session[1092]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:17:54 np0005461738.novalocal systemd-logind[824]: Session 1 logged out. Waiting for processes to exit.
Sep 30 06:18:14 np0005461738.novalocal sshd-session[4093]: Invalid user mysql from 152.32.253.152 port 58736
Sep 30 06:18:14 np0005461738.novalocal sshd-session[4093]: Received disconnect from 152.32.253.152 port 58736:11: Bye Bye [preauth]
Sep 30 06:18:14 np0005461738.novalocal sshd-session[4093]: Disconnected from invalid user mysql 152.32.253.152 port 58736 [preauth]
Sep 30 06:18:17 np0005461738.novalocal sshd-session[4095]: Accepted publickey for zuul from 38.102.83.114 port 57448 ssh2: RSA SHA256:C4HSx/cRfeCq/OvvwsL+6J5kLvcqUebsiQoFmFCWCHY
Sep 30 06:18:17 np0005461738.novalocal systemd-logind[824]: New session 3 of user zuul.
Sep 30 06:18:17 np0005461738.novalocal systemd[1]: Started Session 3 of User zuul.
Sep 30 06:18:17 np0005461738.novalocal sshd-session[4095]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:18:18 np0005461738.novalocal sudo[4174]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpedgnsecuminhxxcenilyyvtmfawbmp ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 06:18:18 np0005461738.novalocal sudo[4174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:18:18 np0005461738.novalocal python3[4176]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:18:18 np0005461738.novalocal sudo[4174]: pam_unix(sudo:session): session closed for user root
Sep 30 06:18:18 np0005461738.novalocal sudo[4247]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cobaxvydbxukdmwpkbovkrwbenxanfpn ; OS_CLOUD=vexxhost /usr/bin/python3'
Sep 30 06:18:18 np0005461738.novalocal sudo[4247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:18:18 np0005461738.novalocal python3[4249]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759213098.0567343-309-202507826027928/source _original_basename=tmpulv4nqlp follow=False checksum=cbe16651e79ebfa3acab95520090aafe4ce400d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:18:18 np0005461738.novalocal sudo[4247]: pam_unix(sudo:session): session closed for user root
Sep 30 06:18:22 np0005461738.novalocal sshd-session[4098]: Connection closed by 38.102.83.114 port 57448
Sep 30 06:18:22 np0005461738.novalocal sshd-session[4095]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:18:22 np0005461738.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Sep 30 06:18:22 np0005461738.novalocal systemd-logind[824]: Session 3 logged out. Waiting for processes to exit.
Sep 30 06:18:22 np0005461738.novalocal systemd-logind[824]: Removed session 3.
Sep 30 06:19:12 np0005461738.novalocal sshd[1008]: Timeout before authentication for connection from 171.80.14.101 to 38.102.83.22, pid = 4062
Sep 30 06:19:17 np0005461738.novalocal sshd-session[4276]: Received disconnect from 152.32.253.152 port 54056:11: Bye Bye [preauth]
Sep 30 06:19:17 np0005461738.novalocal sshd-session[4276]: Disconnected from authenticating user root 152.32.253.152 port 54056 [preauth]
Sep 30 06:20:17 np0005461738.novalocal sshd-session[4278]: Invalid user sara from 152.32.253.152 port 49368
Sep 30 06:20:17 np0005461738.novalocal sshd-session[4278]: Received disconnect from 152.32.253.152 port 49368:11: Bye Bye [preauth]
Sep 30 06:20:17 np0005461738.novalocal sshd-session[4278]: Disconnected from invalid user sara 152.32.253.152 port 49368 [preauth]
Sep 30 06:20:27 np0005461738.novalocal sshd-session[4280]: Received disconnect from 91.224.92.28 port 39500:11:  [preauth]
Sep 30 06:20:27 np0005461738.novalocal sshd-session[4280]: Disconnected from authenticating user root 91.224.92.28 port 39500 [preauth]
Sep 30 06:20:37 np0005461738.novalocal systemd[1096]: Created slice User Background Tasks Slice.
Sep 30 06:20:37 np0005461738.novalocal systemd[1096]: Starting Cleanup of User's Temporary Files and Directories...
Sep 30 06:20:37 np0005461738.novalocal systemd[1096]: Finished Cleanup of User's Temporary Files and Directories.
Sep 30 06:21:19 np0005461738.novalocal sshd-session[4285]: Invalid user kkadmin from 152.32.253.152 port 44684
Sep 30 06:21:20 np0005461738.novalocal sshd-session[4285]: Received disconnect from 152.32.253.152 port 44684:11: Bye Bye [preauth]
Sep 30 06:21:20 np0005461738.novalocal sshd-session[4285]: Disconnected from invalid user kkadmin 152.32.253.152 port 44684 [preauth]
Sep 30 06:22:24 np0005461738.novalocal sshd-session[4287]: Invalid user rain from 152.32.253.152 port 40002
Sep 30 06:22:24 np0005461738.novalocal sshd-session[4287]: Received disconnect from 152.32.253.152 port 40002:11: Bye Bye [preauth]
Sep 30 06:22:24 np0005461738.novalocal sshd-session[4287]: Disconnected from invalid user rain 152.32.253.152 port 40002 [preauth]
Sep 30 06:23:22 np0005461738.novalocal sshd-session[4290]: Accepted publickey for zuul from 38.102.83.114 port 36884 ssh2: RSA SHA256:C4HSx/cRfeCq/OvvwsL+6J5kLvcqUebsiQoFmFCWCHY
Sep 30 06:23:22 np0005461738.novalocal systemd-logind[824]: New session 4 of user zuul.
Sep 30 06:23:23 np0005461738.novalocal systemd[1]: Started Session 4 of User zuul.
Sep 30 06:23:23 np0005461738.novalocal sshd-session[4290]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:23:23 np0005461738.novalocal sudo[4317]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tszczyyikvspdkixcxayyguislnzdrul ; /usr/bin/python3'
Sep 30 06:23:23 np0005461738.novalocal sudo[4317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:23:23 np0005461738.novalocal python3[4319]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-b83c-8cff-000000001cf1-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:23:23 np0005461738.novalocal sudo[4317]: pam_unix(sudo:session): session closed for user root
Sep 30 06:23:23 np0005461738.novalocal sudo[4346]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etpijkwskgqbcfokgvmtnbqwpopkumne ; /usr/bin/python3'
Sep 30 06:23:23 np0005461738.novalocal sudo[4346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:23:24 np0005461738.novalocal python3[4348]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:23:24 np0005461738.novalocal sudo[4346]: pam_unix(sudo:session): session closed for user root
Sep 30 06:23:24 np0005461738.novalocal sudo[4372]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukuzyjefovrlvyklhaftoaetblzuqwgb ; /usr/bin/python3'
Sep 30 06:23:24 np0005461738.novalocal sudo[4372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:23:24 np0005461738.novalocal python3[4374]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:23:24 np0005461738.novalocal sudo[4372]: pam_unix(sudo:session): session closed for user root
Sep 30 06:23:24 np0005461738.novalocal sudo[4398]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgguvqkcqzttqfwoikxecxololxyvlli ; /usr/bin/python3'
Sep 30 06:23:24 np0005461738.novalocal sudo[4398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:23:24 np0005461738.novalocal python3[4400]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:23:24 np0005461738.novalocal sudo[4398]: pam_unix(sudo:session): session closed for user root
Sep 30 06:23:24 np0005461738.novalocal sudo[4424]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bskbzzoseyseisffvjlqozxmvzalhbiz ; /usr/bin/python3'
Sep 30 06:23:24 np0005461738.novalocal sudo[4424]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:23:25 np0005461738.novalocal python3[4426]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:23:25 np0005461738.novalocal sudo[4424]: pam_unix(sudo:session): session closed for user root
Sep 30 06:23:25 np0005461738.novalocal sudo[4450]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfukscawgvquolccmhebazokrdxwdhug ; /usr/bin/python3'
Sep 30 06:23:25 np0005461738.novalocal sudo[4450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:23:25 np0005461738.novalocal python3[4452]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:23:25 np0005461738.novalocal python3[4452]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Sep 30 06:23:25 np0005461738.novalocal sudo[4450]: pam_unix(sudo:session): session closed for user root
Sep 30 06:23:25 np0005461738.novalocal sudo[4476]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtqxzyhucbtnvimzribacswteczydymg ; /usr/bin/python3'
Sep 30 06:23:25 np0005461738.novalocal sudo[4476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:23:26 np0005461738.novalocal python3[4478]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 06:23:26 np0005461738.novalocal systemd[1]: Reloading.
Sep 30 06:23:26 np0005461738.novalocal systemd-rc-local-generator[4498]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:23:26 np0005461738.novalocal sudo[4476]: pam_unix(sudo:session): session closed for user root
Sep 30 06:23:27 np0005461738.novalocal sudo[4534]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgbxreznwucnvhiyxjuskcqlrntysmlu ; /usr/bin/python3'
Sep 30 06:23:27 np0005461738.novalocal sudo[4534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:23:28 np0005461738.novalocal python3[4536]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Sep 30 06:23:28 np0005461738.novalocal sudo[4534]: pam_unix(sudo:session): session closed for user root
Sep 30 06:23:28 np0005461738.novalocal sshd-session[4509]: Invalid user so from 152.32.253.152 port 35318
Sep 30 06:23:28 np0005461738.novalocal sudo[4560]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlcxdxdiavhyzdsghbbxhojdgcjtwyco ; /usr/bin/python3'
Sep 30 06:23:28 np0005461738.novalocal sudo[4560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:23:28 np0005461738.novalocal sshd-session[4509]: Received disconnect from 152.32.253.152 port 35318:11: Bye Bye [preauth]
Sep 30 06:23:28 np0005461738.novalocal sshd-session[4509]: Disconnected from invalid user so 152.32.253.152 port 35318 [preauth]
Sep 30 06:23:28 np0005461738.novalocal python3[4562]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:23:28 np0005461738.novalocal sudo[4560]: pam_unix(sudo:session): session closed for user root
Sep 30 06:23:28 np0005461738.novalocal sudo[4588]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-delldblqwvnproihxcmzjlchgdpipqfz ; /usr/bin/python3'
Sep 30 06:23:28 np0005461738.novalocal sudo[4588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:23:28 np0005461738.novalocal python3[4590]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:23:28 np0005461738.novalocal sudo[4588]: pam_unix(sudo:session): session closed for user root
Sep 30 06:23:28 np0005461738.novalocal sudo[4616]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmqvitcntepyvpbekkxtxkczkogaighr ; /usr/bin/python3'
Sep 30 06:23:28 np0005461738.novalocal sudo[4616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:23:29 np0005461738.novalocal python3[4618]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:23:29 np0005461738.novalocal sudo[4616]: pam_unix(sudo:session): session closed for user root
Sep 30 06:23:29 np0005461738.novalocal sudo[4644]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfmdqmninhvivrfxxfetgsuzmthzvlbg ; /usr/bin/python3'
Sep 30 06:23:29 np0005461738.novalocal sudo[4644]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:23:29 np0005461738.novalocal python3[4646]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:23:29 np0005461738.novalocal sudo[4644]: pam_unix(sudo:session): session closed for user root
Sep 30 06:23:29 np0005461738.novalocal python3[4673]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-b83c-8cff-000000001cf7-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:23:30 np0005461738.novalocal python3[4703]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:23:33 np0005461738.novalocal sshd-session[4293]: Connection closed by 38.102.83.114 port 36884
Sep 30 06:23:33 np0005461738.novalocal sshd-session[4290]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:23:33 np0005461738.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Sep 30 06:23:33 np0005461738.novalocal systemd[1]: session-4.scope: Consumed 4.061s CPU time.
Sep 30 06:23:33 np0005461738.novalocal systemd-logind[824]: Session 4 logged out. Waiting for processes to exit.
Sep 30 06:23:33 np0005461738.novalocal systemd-logind[824]: Removed session 4.
Sep 30 06:23:34 np0005461738.novalocal sshd-session[4709]: Accepted publickey for zuul from 38.102.83.114 port 40988 ssh2: RSA SHA256:C4HSx/cRfeCq/OvvwsL+6J5kLvcqUebsiQoFmFCWCHY
Sep 30 06:23:34 np0005461738.novalocal systemd-logind[824]: New session 5 of user zuul.
Sep 30 06:23:34 np0005461738.novalocal systemd[1]: Started Session 5 of User zuul.
Sep 30 06:23:34 np0005461738.novalocal sshd-session[4709]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:23:34 np0005461738.novalocal sudo[4736]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvzyyvciiafzkzofxndvnkzuiwggpjri ; /usr/bin/python3'
Sep 30 06:23:34 np0005461738.novalocal sudo[4736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:23:35 np0005461738.novalocal python3[4738]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Sep 30 06:23:48 np0005461738.novalocal kernel: SELinux:  Converting 364 SID table entries...
Sep 30 06:23:48 np0005461738.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 06:23:48 np0005461738.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 06:23:48 np0005461738.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 06:23:48 np0005461738.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 06:23:48 np0005461738.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 06:23:48 np0005461738.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 06:23:48 np0005461738.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 06:23:57 np0005461738.novalocal kernel: SELinux:  Converting 364 SID table entries...
Sep 30 06:23:57 np0005461738.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 06:23:57 np0005461738.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 06:23:57 np0005461738.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 06:23:57 np0005461738.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 06:23:57 np0005461738.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 06:23:57 np0005461738.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 06:23:57 np0005461738.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 06:24:05 np0005461738.novalocal kernel: SELinux:  Converting 364 SID table entries...
Sep 30 06:24:05 np0005461738.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 06:24:05 np0005461738.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 06:24:05 np0005461738.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 06:24:05 np0005461738.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 06:24:05 np0005461738.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 06:24:05 np0005461738.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 06:24:05 np0005461738.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 06:24:07 np0005461738.novalocal setsebool[4798]: The virt_use_nfs policy boolean was changed to 1 by root
Sep 30 06:24:07 np0005461738.novalocal setsebool[4798]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Sep 30 06:24:16 np0005461738.novalocal sshd-session[4807]: Invalid user admin from 194.0.234.93 port 35590
Sep 30 06:24:16 np0005461738.novalocal sshd-session[4807]: Connection closed by invalid user admin 194.0.234.93 port 35590 [preauth]
Sep 30 06:24:17 np0005461738.novalocal kernel: SELinux:  Converting 367 SID table entries...
Sep 30 06:24:17 np0005461738.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 06:24:17 np0005461738.novalocal kernel: SELinux:  policy capability open_perms=1
Sep 30 06:24:17 np0005461738.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 06:24:17 np0005461738.novalocal kernel: SELinux:  policy capability always_check_network=0
Sep 30 06:24:17 np0005461738.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 06:24:17 np0005461738.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 06:24:17 np0005461738.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 06:24:29 np0005461738.novalocal sshd-session[5515]: Invalid user superadmin from 152.32.253.152 port 58864
Sep 30 06:24:29 np0005461738.novalocal sshd-session[5515]: Received disconnect from 152.32.253.152 port 58864:11: Bye Bye [preauth]
Sep 30 06:24:29 np0005461738.novalocal sshd-session[5515]: Disconnected from invalid user superadmin 152.32.253.152 port 58864 [preauth]
Sep 30 06:24:30 np0005461738.novalocal sshd-session[5517]: Connection closed by 118.145.73.187 port 60718
Sep 30 06:24:35 np0005461738.novalocal dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Sep 30 06:24:35 np0005461738.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 06:24:35 np0005461738.novalocal systemd[1]: Starting man-db-cache-update.service...
Sep 30 06:24:35 np0005461738.novalocal systemd[1]: Reloading.
Sep 30 06:24:35 np0005461738.novalocal systemd-rc-local-generator[5554]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:24:35 np0005461738.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 06:24:36 np0005461738.novalocal systemd[1]: Starting PackageKit Daemon...
Sep 30 06:24:36 np0005461738.novalocal PackageKit[6118]: daemon start
Sep 30 06:24:36 np0005461738.novalocal systemd[1]: Starting Authorization Manager...
Sep 30 06:24:36 np0005461738.novalocal polkitd[6182]: Started polkitd version 0.117
Sep 30 06:24:37 np0005461738.novalocal polkitd[6182]: Loading rules from directory /etc/polkit-1/rules.d
Sep 30 06:24:37 np0005461738.novalocal polkitd[6182]: Loading rules from directory /usr/share/polkit-1/rules.d
Sep 30 06:24:37 np0005461738.novalocal polkitd[6182]: Finished loading, compiling and executing 3 rules
Sep 30 06:24:37 np0005461738.novalocal systemd[1]: Started Authorization Manager.
Sep 30 06:24:37 np0005461738.novalocal polkitd[6182]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Sep 30 06:24:37 np0005461738.novalocal systemd[1]: Started PackageKit Daemon.
Sep 30 06:24:37 np0005461738.novalocal sudo[4736]: pam_unix(sudo:session): session closed for user root
Sep 30 06:24:40 np0005461738.novalocal python3[8520]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                       _uses_shell=True zuul_log_id=fa163e3b-3c83-7e44-538b-00000000000b-1-compute0 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:24:41 np0005461738.novalocal kernel: evm: overlay not supported
Sep 30 06:24:41 np0005461738.novalocal systemd[1096]: Starting D-Bus User Message Bus...
Sep 30 06:24:41 np0005461738.novalocal dbus-broker-launch[9183]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Sep 30 06:24:41 np0005461738.novalocal dbus-broker-launch[9183]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Sep 30 06:24:41 np0005461738.novalocal systemd[1096]: Started D-Bus User Message Bus.
Sep 30 06:24:41 np0005461738.novalocal dbus-broker-lau[9183]: Ready
Sep 30 06:24:41 np0005461738.novalocal systemd[1096]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Sep 30 06:24:41 np0005461738.novalocal systemd[1096]: Created slice Slice /user.
Sep 30 06:24:41 np0005461738.novalocal systemd[1096]: podman-9070.scope: unit configures an IP firewall, but not running as root.
Sep 30 06:24:41 np0005461738.novalocal systemd[1096]: (This warning is only shown for the first unit using IP firewalling.)
Sep 30 06:24:41 np0005461738.novalocal systemd[1096]: Started podman-9070.scope.
Sep 30 06:24:41 np0005461738.novalocal systemd[1096]: Started podman-pause-50ab84f2.scope.
Sep 30 06:24:42 np0005461738.novalocal sudo[9640]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpfohnfgynjccklxfrifyctcqcoumspc ; /usr/bin/python3'
Sep 30 06:24:42 np0005461738.novalocal sudo[9640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:24:42 np0005461738.novalocal python3[9661]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]
                                                      location = "38.102.83.30:5001"
                                                      insecure = true path=/etc/containers/registries.conf block=[[registry]]
                                                      location = "38.102.83.30:5001"
                                                      insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:24:42 np0005461738.novalocal sudo[9640]: pam_unix(sudo:session): session closed for user root
Sep 30 06:24:42 np0005461738.novalocal sshd-session[4712]: Connection closed by 38.102.83.114 port 40988
Sep 30 06:24:42 np0005461738.novalocal sshd-session[4709]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:24:42 np0005461738.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Sep 30 06:24:42 np0005461738.novalocal systemd[1]: session-5.scope: Consumed 58.745s CPU time.
Sep 30 06:24:42 np0005461738.novalocal systemd-logind[824]: Session 5 logged out. Waiting for processes to exit.
Sep 30 06:24:42 np0005461738.novalocal systemd-logind[824]: Removed session 5.
Sep 30 06:24:44 np0005461738.novalocal sshd-session[10674]: Accepted publickey for zuul from 38.102.83.114 port 60772 ssh2: RSA SHA256:C4HSx/cRfeCq/OvvwsL+6J5kLvcqUebsiQoFmFCWCHY
Sep 30 06:24:44 np0005461738.novalocal systemd-logind[824]: New session 6 of user zuul.
Sep 30 06:24:44 np0005461738.novalocal systemd[1]: Started Session 6 of User zuul.
Sep 30 06:24:44 np0005461738.novalocal sshd-session[10674]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:24:44 np0005461738.novalocal python3[10701]: ansible-ansible.builtin.stat Invoked with path=/var/lib/zuul/builds/7f47d7c16caa45d2974962c59889ca32/untrusted/project_0/github.com/openstack-k8s-operators/ci-framework/ci/playbooks/group_vars/all.yml follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:24:45 np0005461738.novalocal sshd-session[10677]: Connection closed by 38.102.83.114 port 60772
Sep 30 06:24:45 np0005461738.novalocal sshd-session[10674]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:24:45 np0005461738.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Sep 30 06:24:45 np0005461738.novalocal systemd-logind[824]: Session 6 logged out. Waiting for processes to exit.
Sep 30 06:24:45 np0005461738.novalocal systemd-logind[824]: Removed session 6.
Sep 30 06:25:04 np0005461738.novalocal sshd-session[16697]: Connection closed by 38.102.83.45 port 46834 [preauth]
Sep 30 06:25:04 np0005461738.novalocal sshd-session[16700]: Unable to negotiate with 38.102.83.45 port 46846: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Sep 30 06:25:04 np0005461738.novalocal sshd-session[16698]: Unable to negotiate with 38.102.83.45 port 46866: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Sep 30 06:25:04 np0005461738.novalocal sshd-session[16703]: Connection closed by 38.102.83.45 port 46828 [preauth]
Sep 30 06:25:04 np0005461738.novalocal sshd-session[16705]: Unable to negotiate with 38.102.83.45 port 46860: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Sep 30 06:25:09 np0005461738.novalocal sshd-session[18060]: Accepted publickey for zuul from 38.102.83.114 port 50740 ssh2: RSA SHA256:C4HSx/cRfeCq/OvvwsL+6J5kLvcqUebsiQoFmFCWCHY
Sep 30 06:25:09 np0005461738.novalocal systemd-logind[824]: New session 7 of user zuul.
Sep 30 06:25:09 np0005461738.novalocal systemd[1]: Started Session 7 of User zuul.
Sep 30 06:25:09 np0005461738.novalocal sshd-session[18060]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:25:09 np0005461738.novalocal python3[18150]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP7sfUj54kR+85fTqDEJmFIxfTGS0OkX3RS/w+nW9w8qjmYjLwpjvaAawRuycQs6kVvnJPbd62ueOTs0G8QWEho= zuul@np0005461737.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:25:09 np0005461738.novalocal sudo[18298]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpszubzjwhcqnkktbjynxjjkbarwrlid ; /usr/bin/python3'
Sep 30 06:25:09 np0005461738.novalocal sudo[18298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:25:09 np0005461738.novalocal python3[18308]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP7sfUj54kR+85fTqDEJmFIxfTGS0OkX3RS/w+nW9w8qjmYjLwpjvaAawRuycQs6kVvnJPbd62ueOTs0G8QWEho= zuul@np0005461737.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:25:09 np0005461738.novalocal sudo[18298]: pam_unix(sudo:session): session closed for user root
Sep 30 06:25:10 np0005461738.novalocal sudo[18587]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmqwhbxejvnucbroddkbrhkflrvkgzbq ; /usr/bin/python3'
Sep 30 06:25:10 np0005461738.novalocal sudo[18587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:25:10 np0005461738.novalocal python3[18598]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005461738.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Sep 30 06:25:10 np0005461738.novalocal useradd[18657]: new group: name=cloud-admin, GID=1002
Sep 30 06:25:10 np0005461738.novalocal useradd[18657]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Sep 30 06:25:10 np0005461738.novalocal sudo[18587]: pam_unix(sudo:session): session closed for user root
Sep 30 06:25:11 np0005461738.novalocal sudo[18761]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebbmrgjphxyqqcccivwjjwwpnvhhnnfb ; /usr/bin/python3'
Sep 30 06:25:11 np0005461738.novalocal sudo[18761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:25:11 np0005461738.novalocal python3[18774]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP7sfUj54kR+85fTqDEJmFIxfTGS0OkX3RS/w+nW9w8qjmYjLwpjvaAawRuycQs6kVvnJPbd62ueOTs0G8QWEho= zuul@np0005461737.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Sep 30 06:25:11 np0005461738.novalocal sudo[18761]: pam_unix(sudo:session): session closed for user root
Sep 30 06:25:11 np0005461738.novalocal sudo[18992]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owriezzorxundbnzgiuyckftppwxxkll ; /usr/bin/python3'
Sep 30 06:25:11 np0005461738.novalocal sudo[18992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:25:11 np0005461738.novalocal python3[19001]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:25:11 np0005461738.novalocal sudo[18992]: pam_unix(sudo:session): session closed for user root
Sep 30 06:25:12 np0005461738.novalocal sudo[19185]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upxbtutyhhowvkauysshfjjivajgjlud ; /usr/bin/python3'
Sep 30 06:25:12 np0005461738.novalocal sudo[19185]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:25:12 np0005461738.novalocal python3[19193]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759213511.4806385-151-171491658532142/source _original_basename=tmpxxogeezm follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:25:12 np0005461738.novalocal sudo[19185]: pam_unix(sudo:session): session closed for user root
Sep 30 06:25:12 np0005461738.novalocal sudo[19466]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irlvngycmmdxggzorjxcckbfedshzfou ; /usr/bin/python3'
Sep 30 06:25:12 np0005461738.novalocal sudo[19466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:25:13 np0005461738.novalocal python3[19475]: ansible-ansible.builtin.hostname Invoked with name=compute-0 use=systemd
Sep 30 06:25:13 np0005461738.novalocal systemd[1]: Starting Hostname Service...
Sep 30 06:25:13 np0005461738.novalocal systemd[1]: Started Hostname Service.
Sep 30 06:25:13 np0005461738.novalocal systemd-hostnamed[19567]: Changed pretty hostname to 'compute-0'
Sep 30 06:25:13 compute-0 systemd-hostnamed[19567]: Hostname set to <compute-0> (static)
Sep 30 06:25:13 compute-0 NetworkManager[3981]: <info>  [1759213513.3308] hostname: static hostname changed from "np0005461738.novalocal" to "compute-0"
Sep 30 06:25:13 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 06:25:13 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 06:25:13 compute-0 sudo[19466]: pam_unix(sudo:session): session closed for user root
Sep 30 06:25:13 compute-0 sshd-session[18098]: Connection closed by 38.102.83.114 port 50740
Sep 30 06:25:13 compute-0 sshd-session[18060]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:25:13 compute-0 systemd[1]: session-7.scope: Deactivated successfully.
Sep 30 06:25:13 compute-0 systemd[1]: session-7.scope: Consumed 2.689s CPU time.
Sep 30 06:25:13 compute-0 systemd-logind[824]: Session 7 logged out. Waiting for processes to exit.
Sep 30 06:25:13 compute-0 systemd-logind[824]: Removed session 7.
Sep 30 06:25:23 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 06:25:29 compute-0 sshd-session[24020]: Invalid user user from 152.32.253.152 port 54176
Sep 30 06:25:30 compute-0 sshd-session[24020]: Received disconnect from 152.32.253.152 port 54176:11: Bye Bye [preauth]
Sep 30 06:25:30 compute-0 sshd-session[24020]: Disconnected from invalid user user 152.32.253.152 port 54176 [preauth]
Sep 30 06:25:37 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 06:25:37 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 06:25:37 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1min 16.984s CPU time.
Sep 30 06:25:37 compute-0 systemd[1]: run-rf37610a269784eadab5f8dd0823604ab.service: Deactivated successfully.
Sep 30 06:25:43 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 06:26:20 compute-0 sshd-session[26618]: Received disconnect from 193.46.255.7 port 12272:11:  [preauth]
Sep 30 06:26:20 compute-0 sshd-session[26618]: Disconnected from authenticating user root 193.46.255.7 port 12272 [preauth]
Sep 30 06:26:30 compute-0 sshd-session[26622]: Received disconnect from 152.32.253.152 port 49494:11: Bye Bye [preauth]
Sep 30 06:26:30 compute-0 sshd-session[26622]: Disconnected from authenticating user root 152.32.253.152 port 49494 [preauth]
Sep 30 06:27:31 compute-0 sshd-session[26626]: Invalid user testuser from 152.32.253.152 port 44810
Sep 30 06:27:31 compute-0 sshd-session[26626]: Received disconnect from 152.32.253.152 port 44810:11: Bye Bye [preauth]
Sep 30 06:27:31 compute-0 sshd-session[26626]: Disconnected from invalid user testuser 152.32.253.152 port 44810 [preauth]
Sep 30 06:28:36 compute-0 sshd-session[26629]: Invalid user testuser from 152.32.253.152 port 40126
Sep 30 06:28:36 compute-0 sshd-session[26629]: Received disconnect from 152.32.253.152 port 40126:11: Bye Bye [preauth]
Sep 30 06:28:36 compute-0 sshd-session[26629]: Disconnected from invalid user testuser 152.32.253.152 port 40126 [preauth]
Sep 30 06:28:39 compute-0 sshd-session[26631]: Accepted publickey for zuul from 38.102.83.45 port 38296 ssh2: RSA SHA256:C4HSx/cRfeCq/OvvwsL+6J5kLvcqUebsiQoFmFCWCHY
Sep 30 06:28:40 compute-0 systemd-logind[824]: New session 8 of user zuul.
Sep 30 06:28:40 compute-0 systemd[1]: Started Session 8 of User zuul.
Sep 30 06:28:40 compute-0 sshd-session[26631]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:28:40 compute-0 python3[26707]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:28:43 compute-0 sudo[26821]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ysscvvaaokfhhydswpcneosnfyixtypv ; /usr/bin/python3'
Sep 30 06:28:43 compute-0 sudo[26821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:28:43 compute-0 python3[26823]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:28:43 compute-0 sudo[26821]: pam_unix(sudo:session): session closed for user root
Sep 30 06:28:43 compute-0 sudo[26894]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzyeaousrcjoiedokhvkfatqmbfgbweq ; /usr/bin/python3'
Sep 30 06:28:43 compute-0 sudo[26894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:28:43 compute-0 python3[26896]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759213723.0101936-30468-256896818034876/source mode=0755 _original_basename=delorean.repo follow=False checksum=63afb3d718e99c27c8a803d158707ec5e03078ec backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:28:43 compute-0 sudo[26894]: pam_unix(sudo:session): session closed for user root
Sep 30 06:28:43 compute-0 sudo[26920]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvpcxwhkdaywsjflwiofgmevdqjngryc ; /usr/bin/python3'
Sep 30 06:28:43 compute-0 sudo[26920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:28:44 compute-0 python3[26922]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-master-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:28:44 compute-0 sudo[26920]: pam_unix(sudo:session): session closed for user root
Sep 30 06:28:44 compute-0 sudo[26993]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsrngwmujzkrgovxtpnrlzbbmkmmuujd ; /usr/bin/python3'
Sep 30 06:28:44 compute-0 sudo[26993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:28:44 compute-0 python3[26995]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759213723.0101936-30468-256896818034876/source mode=0755 _original_basename=delorean-master-testing.repo follow=False checksum=c22157e85d05af7ffbafa054f80958446d397a41 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:28:44 compute-0 sudo[26993]: pam_unix(sudo:session): session closed for user root
Sep 30 06:28:44 compute-0 sudo[27019]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqmzujoluargeupusekbrnvvuyhfoujw ; /usr/bin/python3'
Sep 30 06:28:44 compute-0 sudo[27019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:28:44 compute-0 python3[27021]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:28:44 compute-0 sudo[27019]: pam_unix(sudo:session): session closed for user root
Sep 30 06:28:45 compute-0 sudo[27092]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eorpjfschntldkfgnpreggftvlbddyyz ; /usr/bin/python3'
Sep 30 06:28:45 compute-0 sudo[27092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:28:45 compute-0 python3[27094]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759213723.0101936-30468-256896818034876/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:28:45 compute-0 sudo[27092]: pam_unix(sudo:session): session closed for user root
Sep 30 06:28:45 compute-0 sudo[27118]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgchnjnybjhpuyufqmkcdsnqiaumjzor ; /usr/bin/python3'
Sep 30 06:28:45 compute-0 sudo[27118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:28:45 compute-0 python3[27120]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:28:45 compute-0 sudo[27118]: pam_unix(sudo:session): session closed for user root
Sep 30 06:28:45 compute-0 sudo[27191]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzacuwxuvmjnjzbghwgepmgqampemtsl ; /usr/bin/python3'
Sep 30 06:28:45 compute-0 sudo[27191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:28:46 compute-0 python3[27193]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759213723.0101936-30468-256896818034876/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:28:46 compute-0 sudo[27191]: pam_unix(sudo:session): session closed for user root
Sep 30 06:28:46 compute-0 sudo[27217]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opojtssdjfwzlcjbfnyfuvtyxhqurbsx ; /usr/bin/python3'
Sep 30 06:28:46 compute-0 sudo[27217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:28:46 compute-0 python3[27219]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:28:46 compute-0 sudo[27217]: pam_unix(sudo:session): session closed for user root
Sep 30 06:28:46 compute-0 sudo[27290]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zdbiiteiohkvohdhsygxsnfczwujtndu ; /usr/bin/python3'
Sep 30 06:28:46 compute-0 sudo[27290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:28:46 compute-0 python3[27292]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759213723.0101936-30468-256896818034876/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:28:46 compute-0 sudo[27290]: pam_unix(sudo:session): session closed for user root
Sep 30 06:28:46 compute-0 sudo[27316]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cblmaydrwggagshqufidfsjtwkokkjdv ; /usr/bin/python3'
Sep 30 06:28:46 compute-0 sudo[27316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:28:47 compute-0 python3[27318]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:28:47 compute-0 sudo[27316]: pam_unix(sudo:session): session closed for user root
Sep 30 06:28:47 compute-0 sudo[27389]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwjboxzfncseihvbreohlxicpcxkmdki ; /usr/bin/python3'
Sep 30 06:28:47 compute-0 sudo[27389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:28:47 compute-0 python3[27391]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759213723.0101936-30468-256896818034876/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:28:47 compute-0 sudo[27389]: pam_unix(sudo:session): session closed for user root
Sep 30 06:28:47 compute-0 sudo[27415]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewtwvnsdegdokbphdgpfrscgxozcbdwd ; /usr/bin/python3'
Sep 30 06:28:47 compute-0 sudo[27415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:28:47 compute-0 python3[27417]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:28:47 compute-0 sudo[27415]: pam_unix(sudo:session): session closed for user root
Sep 30 06:28:48 compute-0 sudo[27488]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knlrdlxbmxvqojchvytendhtribnqfwr ; /usr/bin/python3'
Sep 30 06:28:48 compute-0 sudo[27488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:28:48 compute-0 python3[27490]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759213723.0101936-30468-256896818034876/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=1d4337ff1f040a6736604012d409c55c328802cd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:28:48 compute-0 sudo[27488]: pam_unix(sudo:session): session closed for user root
Sep 30 06:28:48 compute-0 sudo[27514]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbyglgikrznkliclhsxqhdfgikilotzn ; /usr/bin/python3'
Sep 30 06:28:48 compute-0 sudo[27514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:28:48 compute-0 python3[27516]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/gating.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Sep 30 06:28:48 compute-0 sudo[27514]: pam_unix(sudo:session): session closed for user root
Sep 30 06:28:48 compute-0 sudo[27587]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaevcytifvfbbamfipfrkeehoggmzkoz ; /usr/bin/python3'
Sep 30 06:28:48 compute-0 sudo[27587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:28:48 compute-0 python3[27589]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759213723.0101936-30468-256896818034876/source mode=0755 _original_basename=gating.repo follow=False checksum=c5b62a3bac5198fa0e7af5b3084c82dfdf674f98 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:28:48 compute-0 sudo[27587]: pam_unix(sudo:session): session closed for user root
Sep 30 06:28:51 compute-0 sshd-session[27614]: Connection closed by 192.168.122.11 port 34894 [preauth]
Sep 30 06:28:51 compute-0 sshd-session[27616]: Connection closed by 192.168.122.11 port 34906 [preauth]
Sep 30 06:28:51 compute-0 sshd-session[27615]: Unable to negotiate with 192.168.122.11 port 34924: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Sep 30 06:28:51 compute-0 sshd-session[27617]: Unable to negotiate with 192.168.122.11 port 34916: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Sep 30 06:28:51 compute-0 sshd-session[27619]: Unable to negotiate with 192.168.122.11 port 34920: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Sep 30 06:29:42 compute-0 PackageKit[6118]: daemon quit
Sep 30 06:29:42 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Sep 30 06:29:45 compute-0 sshd-session[27624]: Invalid user dima from 152.32.253.152 port 35448
Sep 30 06:29:45 compute-0 sshd-session[27624]: Received disconnect from 152.32.253.152 port 35448:11: Bye Bye [preauth]
Sep 30 06:29:45 compute-0 sshd-session[27624]: Disconnected from invalid user dima 152.32.253.152 port 35448 [preauth]
Sep 30 06:29:54 compute-0 python3[27650]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:30:52 compute-0 sshd-session[27652]: Invalid user user from 152.32.253.152 port 59000
Sep 30 06:30:52 compute-0 sshd-session[27652]: Received disconnect from 152.32.253.152 port 59000:11: Bye Bye [preauth]
Sep 30 06:30:52 compute-0 sshd-session[27652]: Disconnected from invalid user user 152.32.253.152 port 59000 [preauth]
Sep 30 06:31:54 compute-0 sshd-session[27654]: Received disconnect from 91.224.92.32 port 31576:11:  [preauth]
Sep 30 06:31:54 compute-0 sshd-session[27654]: Disconnected from authenticating user root 91.224.92.32 port 31576 [preauth]
Sep 30 06:31:54 compute-0 sshd-session[27656]: Invalid user carol from 152.32.253.152 port 54314
Sep 30 06:31:54 compute-0 sshd-session[27656]: Received disconnect from 152.32.253.152 port 54314:11: Bye Bye [preauth]
Sep 30 06:31:54 compute-0 sshd-session[27656]: Disconnected from invalid user carol 152.32.253.152 port 54314 [preauth]
Sep 30 06:32:58 compute-0 sshd-session[27658]: Invalid user lander from 152.32.253.152 port 49630
Sep 30 06:32:58 compute-0 sshd-session[27658]: Received disconnect from 152.32.253.152 port 49630:11: Bye Bye [preauth]
Sep 30 06:32:58 compute-0 sshd-session[27658]: Disconnected from invalid user lander 152.32.253.152 port 49630 [preauth]
Sep 30 06:33:50 compute-0 sshd-session[27661]: Invalid user admin from 194.0.234.19 port 60850
Sep 30 06:33:50 compute-0 sshd-session[27661]: Connection closed by invalid user admin 194.0.234.19 port 60850 [preauth]
Sep 30 06:34:01 compute-0 sshd-session[27663]: Invalid user ssm from 152.32.253.152 port 44944
Sep 30 06:34:01 compute-0 sshd-session[27663]: Received disconnect from 152.32.253.152 port 44944:11: Bye Bye [preauth]
Sep 30 06:34:01 compute-0 sshd-session[27663]: Disconnected from invalid user ssm 152.32.253.152 port 44944 [preauth]
Sep 30 06:34:54 compute-0 sshd-session[26634]: Received disconnect from 38.102.83.45 port 38296:11: disconnected by user
Sep 30 06:34:54 compute-0 sshd-session[26634]: Disconnected from user zuul 38.102.83.45 port 38296
Sep 30 06:34:54 compute-0 sshd-session[26631]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:34:54 compute-0 systemd[1]: session-8.scope: Deactivated successfully.
Sep 30 06:34:54 compute-0 systemd[1]: session-8.scope: Consumed 6.819s CPU time.
Sep 30 06:34:54 compute-0 systemd-logind[824]: Session 8 logged out. Waiting for processes to exit.
Sep 30 06:34:54 compute-0 systemd-logind[824]: Removed session 8.
Sep 30 06:35:06 compute-0 sshd-session[27665]: Received disconnect from 152.32.253.152 port 40268:11: Bye Bye [preauth]
Sep 30 06:35:06 compute-0 sshd-session[27665]: Disconnected from authenticating user root 152.32.253.152 port 40268 [preauth]
Sep 30 06:36:10 compute-0 sshd-session[27668]: Received disconnect from 152.32.253.152 port 35586:11: Bye Bye [preauth]
Sep 30 06:36:10 compute-0 sshd-session[27668]: Disconnected from authenticating user root 152.32.253.152 port 35586 [preauth]
Sep 30 06:37:13 compute-0 sshd-session[27670]: Received disconnect from 152.32.253.152 port 59138:11: Bye Bye [preauth]
Sep 30 06:37:13 compute-0 sshd-session[27670]: Disconnected from authenticating user root 152.32.253.152 port 59138 [preauth]
Sep 30 06:37:40 compute-0 sshd-session[27672]: Received disconnect from 91.224.92.79 port 23488:11:  [preauth]
Sep 30 06:37:40 compute-0 sshd-session[27672]: Disconnected from authenticating user root 91.224.92.79 port 23488 [preauth]
Sep 30 06:38:15 compute-0 sshd-session[27674]: Received disconnect from 152.32.253.152 port 54456:11: Bye Bye [preauth]
Sep 30 06:38:15 compute-0 sshd-session[27674]: Disconnected from authenticating user root 152.32.253.152 port 54456 [preauth]
Sep 30 06:39:16 compute-0 sshd-session[27676]: Received disconnect from 152.32.253.152 port 49776:11: Bye Bye [preauth]
Sep 30 06:39:16 compute-0 sshd-session[27676]: Disconnected from authenticating user root 152.32.253.152 port 49776 [preauth]
Sep 30 06:39:29 compute-0 sshd-session[27678]: Invalid user admin from 185.156.73.233 port 26350
Sep 30 06:39:29 compute-0 sshd-session[27678]: Connection closed by invalid user admin 185.156.73.233 port 26350 [preauth]
Sep 30 06:40:21 compute-0 sshd-session[27680]: Received disconnect from 152.32.253.152 port 45094:11: Bye Bye [preauth]
Sep 30 06:40:21 compute-0 sshd-session[27680]: Disconnected from authenticating user root 152.32.253.152 port 45094 [preauth]
Sep 30 06:41:27 compute-0 sshd-session[27684]: Invalid user ftpadmin from 152.32.253.152 port 40414
Sep 30 06:41:27 compute-0 sshd-session[27684]: Received disconnect from 152.32.253.152 port 40414:11: Bye Bye [preauth]
Sep 30 06:41:27 compute-0 sshd-session[27684]: Disconnected from invalid user ftpadmin 152.32.253.152 port 40414 [preauth]
Sep 30 06:41:42 compute-0 sshd-session[27687]: Accepted publickey for zuul from 192.168.122.30 port 50668 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 06:41:42 compute-0 systemd-logind[824]: New session 9 of user zuul.
Sep 30 06:41:42 compute-0 systemd[1]: Started Session 9 of User zuul.
Sep 30 06:41:42 compute-0 sshd-session[27687]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:41:43 compute-0 python3.9[27840]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:41:44 compute-0 sudo[28019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cginhzurghfoolsetmaxsnvabnpxdhmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214504.2115643-44-12340980430839/AnsiballZ_command.py'
Sep 30 06:41:44 compute-0 sudo[28019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:41:44 compute-0 python3.9[28021]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:41:52 compute-0 sudo[28019]: pam_unix(sudo:session): session closed for user root
Sep 30 06:41:53 compute-0 sshd-session[27690]: Connection closed by 192.168.122.30 port 50668
Sep 30 06:41:53 compute-0 sshd-session[27687]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:41:53 compute-0 systemd[1]: session-9.scope: Deactivated successfully.
Sep 30 06:41:53 compute-0 systemd[1]: session-9.scope: Consumed 8.148s CPU time.
Sep 30 06:41:53 compute-0 systemd-logind[824]: Session 9 logged out. Waiting for processes to exit.
Sep 30 06:41:53 compute-0 systemd-logind[824]: Removed session 9.
Sep 30 06:41:58 compute-0 sshd-session[28078]: Accepted publickey for zuul from 192.168.122.30 port 47148 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 06:41:58 compute-0 systemd-logind[824]: New session 10 of user zuul.
Sep 30 06:41:58 compute-0 systemd[1]: Started Session 10 of User zuul.
Sep 30 06:41:58 compute-0 sshd-session[28078]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:41:59 compute-0 python3.9[28231]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:41:59 compute-0 sshd-session[28081]: Connection closed by 192.168.122.30 port 47148
Sep 30 06:41:59 compute-0 sshd-session[28078]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:41:59 compute-0 systemd-logind[824]: Session 10 logged out. Waiting for processes to exit.
Sep 30 06:41:59 compute-0 systemd[1]: session-10.scope: Deactivated successfully.
Sep 30 06:41:59 compute-0 systemd-logind[824]: Removed session 10.
Sep 30 06:42:15 compute-0 sshd-session[28259]: Accepted publickey for zuul from 192.168.122.30 port 50004 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 06:42:15 compute-0 systemd-logind[824]: New session 11 of user zuul.
Sep 30 06:42:15 compute-0 systemd[1]: Started Session 11 of User zuul.
Sep 30 06:42:15 compute-0 sshd-session[28259]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:42:16 compute-0 python3.9[28412]: ansible-ansible.legacy.ping Invoked with data=pong
Sep 30 06:42:17 compute-0 python3.9[28586]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:42:18 compute-0 sudo[28736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omlxtykgaikaqergwppjifkvrscvwqnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214537.9910862-69-123927813715823/AnsiballZ_command.py'
Sep 30 06:42:18 compute-0 sudo[28736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:42:18 compute-0 python3.9[28738]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:42:18 compute-0 sudo[28736]: pam_unix(sudo:session): session closed for user root
Sep 30 06:42:19 compute-0 sudo[28889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbwmswqwxqckbisweqqheyioykecqzcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214539.2400453-93-134931861285194/AnsiballZ_stat.py'
Sep 30 06:42:19 compute-0 sudo[28889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:42:19 compute-0 python3.9[28891]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:42:19 compute-0 sudo[28889]: pam_unix(sudo:session): session closed for user root
Sep 30 06:42:20 compute-0 sudo[29041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ulpwwvsprzlesipdijqjcaktxagudakj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214540.1406953-109-218980160232320/AnsiballZ_file.py'
Sep 30 06:42:20 compute-0 sudo[29041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:42:20 compute-0 python3.9[29043]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:42:20 compute-0 sudo[29041]: pam_unix(sudo:session): session closed for user root
Sep 30 06:42:21 compute-0 sudo[29193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffnsxyzzqdwrxdgaoicrassgkjlunxkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214541.125384-125-47542312401862/AnsiballZ_stat.py'
Sep 30 06:42:21 compute-0 sudo[29193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:42:21 compute-0 python3.9[29195]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:42:21 compute-0 sudo[29193]: pam_unix(sudo:session): session closed for user root
Sep 30 06:42:22 compute-0 sudo[29316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grvkgwzvjrztldtwciakqnnkegudaxzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214541.125384-125-47542312401862/AnsiballZ_copy.py'
Sep 30 06:42:22 compute-0 sudo[29316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:42:22 compute-0 python3.9[29318]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759214541.125384-125-47542312401862/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:42:22 compute-0 sudo[29316]: pam_unix(sudo:session): session closed for user root
Sep 30 06:42:23 compute-0 sudo[29468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsmyoavznyvodppvljkgjltpkepwglsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214542.718505-155-24015224787109/AnsiballZ_setup.py'
Sep 30 06:42:23 compute-0 sudo[29468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:42:23 compute-0 python3.9[29470]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:42:23 compute-0 sudo[29468]: pam_unix(sudo:session): session closed for user root
Sep 30 06:42:24 compute-0 sudo[29624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sflgbsojadtjfuybvrylypjsxurwojvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214543.8220577-171-183169075349340/AnsiballZ_file.py'
Sep 30 06:42:24 compute-0 sudo[29624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:42:24 compute-0 python3.9[29626]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:42:24 compute-0 sudo[29624]: pam_unix(sudo:session): session closed for user root
Sep 30 06:42:25 compute-0 python3.9[29776]: ansible-ansible.builtin.service_facts Invoked
Sep 30 06:42:31 compute-0 python3.9[30031]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:42:32 compute-0 python3.9[30181]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:42:33 compute-0 python3.9[30335]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:42:34 compute-0 sudo[30491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrdimumgneddfianagjduupxcbuqtkfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214554.2737098-267-189995585770745/AnsiballZ_setup.py'
Sep 30 06:42:34 compute-0 sudo[30491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:42:34 compute-0 python3.9[30493]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 06:42:35 compute-0 sudo[30491]: pam_unix(sudo:session): session closed for user root
Sep 30 06:42:35 compute-0 sudo[30577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atpadyvdgapyornuyrkydqfjqbrjuhxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214554.2737098-267-189995585770745/AnsiballZ_dnf.py'
Sep 30 06:42:35 compute-0 sudo[30577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:42:35 compute-0 python3.9[30579]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 06:42:35 compute-0 sshd-session[30494]: Invalid user eva from 152.32.253.152 port 35734
Sep 30 06:42:36 compute-0 sshd-session[30494]: Received disconnect from 152.32.253.152 port 35734:11: Bye Bye [preauth]
Sep 30 06:42:36 compute-0 sshd-session[30494]: Disconnected from invalid user eva 152.32.253.152 port 35734 [preauth]
Sep 30 06:43:06 compute-0 sshd-session[30724]: Received disconnect from 91.224.92.108 port 64330:11:  [preauth]
Sep 30 06:43:06 compute-0 sshd-session[30724]: Disconnected from authenticating user root 91.224.92.108 port 64330 [preauth]
Sep 30 06:43:16 compute-0 systemd[1]: Reloading.
Sep 30 06:43:16 compute-0 systemd-rc-local-generator[30779]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:43:16 compute-0 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Sep 30 06:43:16 compute-0 systemd[1]: Reloading.
Sep 30 06:43:16 compute-0 systemd-rc-local-generator[30820]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:43:16 compute-0 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Sep 30 06:43:16 compute-0 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Sep 30 06:43:16 compute-0 systemd[1]: Reloading.
Sep 30 06:43:17 compute-0 systemd-rc-local-generator[30853]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:43:17 compute-0 systemd[1]: Listening on LVM2 poll daemon socket.
Sep 30 06:43:17 compute-0 dbus-broker-launch[807]: Noticed file-system modification, trigger reload.
Sep 30 06:43:17 compute-0 dbus-broker-launch[807]: Noticed file-system modification, trigger reload.
Sep 30 06:43:17 compute-0 dbus-broker-launch[807]: Noticed file-system modification, trigger reload.
Sep 30 06:43:42 compute-0 sshd-session[30944]: Received disconnect from 152.32.253.152 port 59288:11: Bye Bye [preauth]
Sep 30 06:43:42 compute-0 sshd-session[30944]: Disconnected from authenticating user root 152.32.253.152 port 59288 [preauth]
Sep 30 06:44:27 compute-0 kernel: SELinux:  Converting 2713 SID table entries...
Sep 30 06:44:27 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 06:44:27 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 06:44:27 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 06:44:27 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 06:44:27 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 06:44:27 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 06:44:27 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 06:44:27 compute-0 dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Sep 30 06:44:27 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 06:44:27 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 06:44:27 compute-0 systemd[1]: Reloading.
Sep 30 06:44:27 compute-0 systemd-rc-local-generator[31188]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:44:27 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 06:44:28 compute-0 systemd[1]: Starting PackageKit Daemon...
Sep 30 06:44:28 compute-0 PackageKit[31481]: daemon start
Sep 30 06:44:28 compute-0 systemd[1]: Started PackageKit Daemon.
Sep 30 06:44:28 compute-0 sudo[30577]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:28 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 06:44:28 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 06:44:28 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.311s CPU time.
Sep 30 06:44:28 compute-0 systemd[1]: run-r912a8d08e34d4ca6affe0bb07a8d9160.service: Deactivated successfully.
Sep 30 06:44:28 compute-0 sudo[32108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvcjoxlfbrqasboizrotffifjivkfici ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214668.4565148-291-23025387370433/AnsiballZ_command.py'
Sep 30 06:44:28 compute-0 sudo[32108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:29 compute-0 python3.9[32110]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:44:30 compute-0 sudo[32108]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:31 compute-0 sudo[32389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccwvyxgakmttuvrgynergvxpqtsferib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214670.3350263-307-219623611247/AnsiballZ_selinux.py'
Sep 30 06:44:31 compute-0 sudo[32389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:31 compute-0 python3.9[32391]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Sep 30 06:44:31 compute-0 sudo[32389]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:32 compute-0 sudo[32541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wobubiuccixjiiyovlwdnwfkuhtvanuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214671.904871-329-120419945316885/AnsiballZ_command.py'
Sep 30 06:44:32 compute-0 sudo[32541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:32 compute-0 python3.9[32543]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Sep 30 06:44:33 compute-0 sudo[32541]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:33 compute-0 sudo[32694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chfmqskjervyeiuowlrmurtewnrocjzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214673.5845594-345-219728603169190/AnsiballZ_file.py'
Sep 30 06:44:33 compute-0 sudo[32694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:35 compute-0 python3.9[32696]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:44:35 compute-0 sudo[32694]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:37 compute-0 sudo[32846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gestzfqasxoefnvxftshzzjpkkimilgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214677.3670876-361-146925644583083/AnsiballZ_mount.py'
Sep 30 06:44:37 compute-0 sudo[32846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:38 compute-0 python3.9[32848]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Sep 30 06:44:38 compute-0 sudo[32846]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:39 compute-0 sudo[32998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkiopalywhrsudsjgaqyzlwengzskfah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214678.9145691-417-100189495677067/AnsiballZ_file.py'
Sep 30 06:44:39 compute-0 sudo[32998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:43 compute-0 python3.9[33000]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:44:43 compute-0 sudo[32998]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:43 compute-0 sudo[33150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwcwjayezebvsfxtwvblxcxtonsvkjtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214683.4274018-433-86246744500770/AnsiballZ_stat.py'
Sep 30 06:44:43 compute-0 sudo[33150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:43 compute-0 python3.9[33152]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:44:43 compute-0 sudo[33150]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:44 compute-0 sudo[33275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hojpokajdnordifnnziwnsfahvvylvnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214683.4274018-433-86246744500770/AnsiballZ_copy.py'
Sep 30 06:44:44 compute-0 sudo[33275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:44 compute-0 python3.9[33277]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759214683.4274018-433-86246744500770/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f1630328830f6e47b98e9515af0d5e894e85cff4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:44:44 compute-0 sudo[33275]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:45 compute-0 sshd-session[33247]: Invalid user testuser from 152.32.253.152 port 54604
Sep 30 06:44:45 compute-0 sshd-session[33247]: Received disconnect from 152.32.253.152 port 54604:11: Bye Bye [preauth]
Sep 30 06:44:45 compute-0 sshd-session[33247]: Disconnected from invalid user testuser 152.32.253.152 port 54604 [preauth]
Sep 30 06:44:45 compute-0 sudo[33427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nktgdkolndczbhixelvpdgzmptkzsxet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214685.4187193-487-265493134772314/AnsiballZ_getent.py'
Sep 30 06:44:45 compute-0 sudo[33427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:46 compute-0 python3.9[33429]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Sep 30 06:44:46 compute-0 sudo[33427]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:46 compute-0 sudo[33580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkpwegbfgrpghhemlzzxohohdhewhkfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214686.3851774-503-170093595550148/AnsiballZ_group.py'
Sep 30 06:44:46 compute-0 sudo[33580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:47 compute-0 python3.9[33582]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 06:44:47 compute-0 groupadd[33583]: group added to /etc/group: name=qemu, GID=107
Sep 30 06:44:47 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 06:44:47 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 06:44:47 compute-0 groupadd[33583]: group added to /etc/gshadow: name=qemu
Sep 30 06:44:47 compute-0 groupadd[33583]: new group: name=qemu, GID=107
Sep 30 06:44:47 compute-0 sudo[33580]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:47 compute-0 sudo[33739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfiqtbcxdoviwrfrszlgczjctzjpsldd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214687.4390926-519-197111360119987/AnsiballZ_user.py'
Sep 30 06:44:47 compute-0 sudo[33739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:48 compute-0 python3.9[33741]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 06:44:48 compute-0 useradd[33743]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 06:44:48 compute-0 sudo[33739]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:48 compute-0 sudo[33899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvbwpjtgdxflncvsrddtwzrioyglktkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214688.5841901-535-51661409932724/AnsiballZ_getent.py'
Sep 30 06:44:48 compute-0 sudo[33899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:49 compute-0 python3.9[33901]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Sep 30 06:44:49 compute-0 sudo[33899]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:49 compute-0 sudo[34052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiulybwkhpvcncansnuxsfujrbfxryya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214689.3787794-551-173889644056746/AnsiballZ_group.py'
Sep 30 06:44:49 compute-0 sudo[34052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:49 compute-0 python3.9[34054]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 06:44:49 compute-0 groupadd[34055]: group added to /etc/group: name=hugetlbfs, GID=42477
Sep 30 06:44:49 compute-0 groupadd[34055]: group added to /etc/gshadow: name=hugetlbfs
Sep 30 06:44:49 compute-0 groupadd[34055]: new group: name=hugetlbfs, GID=42477
Sep 30 06:44:49 compute-0 sudo[34052]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:50 compute-0 sudo[34210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecvlrakzuxzdvzrzkseuxlbqxbkldnko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214690.2398727-569-59240175781952/AnsiballZ_file.py'
Sep 30 06:44:50 compute-0 sudo[34210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:50 compute-0 python3.9[34212]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Sep 30 06:44:50 compute-0 sudo[34210]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:51 compute-0 sudo[34362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xxjlopkvrodswwwwjlrukbdafxsybulr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214691.227667-591-10052363664821/AnsiballZ_dnf.py'
Sep 30 06:44:51 compute-0 sudo[34362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:51 compute-0 python3.9[34364]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 06:44:53 compute-0 sudo[34362]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:54 compute-0 sudo[34515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeecpuyqpejjicejlowfekhdahfkacbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214693.7396257-607-117769705022507/AnsiballZ_file.py'
Sep 30 06:44:54 compute-0 sudo[34515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:54 compute-0 python3.9[34517]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:44:54 compute-0 sudo[34515]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:54 compute-0 sudo[34667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cllgaowbcnjpfejrkbbskozswtajwbbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214694.5444746-623-112274489251966/AnsiballZ_stat.py'
Sep 30 06:44:54 compute-0 sudo[34667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:55 compute-0 python3.9[34669]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:44:55 compute-0 sudo[34667]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:55 compute-0 sudo[34790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajiniscieozpcjpkzrldkhrhputtdqcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214694.5444746-623-112274489251966/AnsiballZ_copy.py'
Sep 30 06:44:55 compute-0 sudo[34790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:55 compute-0 python3.9[34792]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759214694.5444746-623-112274489251966/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:44:55 compute-0 sudo[34790]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:56 compute-0 sudo[34942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwskshcjzowjskadincqkpgaaxnjhuqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214696.0241122-653-201424482568363/AnsiballZ_systemd.py'
Sep 30 06:44:56 compute-0 sudo[34942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:57 compute-0 python3.9[34944]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 06:44:57 compute-0 systemd[1]: Starting Load Kernel Modules...
Sep 30 06:44:57 compute-0 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 30 06:44:57 compute-0 kernel: Bridge firewalling registered
Sep 30 06:44:57 compute-0 systemd-modules-load[34948]: Inserted module 'br_netfilter'
Sep 30 06:44:57 compute-0 systemd[1]: Finished Load Kernel Modules.
Sep 30 06:44:57 compute-0 sudo[34942]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:57 compute-0 sudo[35102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uyanxvdgldrvgydjyczgqkilnbybynjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214697.5916212-669-144858531157280/AnsiballZ_stat.py'
Sep 30 06:44:57 compute-0 sudo[35102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:58 compute-0 python3.9[35104]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:44:58 compute-0 sudo[35102]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:58 compute-0 sudo[35225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pblqecusoisbfkrbpzzumkuxumpvegeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214697.5916212-669-144858531157280/AnsiballZ_copy.py'
Sep 30 06:44:58 compute-0 sudo[35225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:58 compute-0 python3.9[35227]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759214697.5916212-669-144858531157280/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:44:58 compute-0 sudo[35225]: pam_unix(sudo:session): session closed for user root
Sep 30 06:44:59 compute-0 sudo[35377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ordozltdxucrlpyrdgchzlxrdrwkyspp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214699.1789722-705-258759875196898/AnsiballZ_dnf.py'
Sep 30 06:44:59 compute-0 sudo[35377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:44:59 compute-0 python3.9[35379]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 06:45:03 compute-0 dbus-broker-launch[807]: Noticed file-system modification, trigger reload.
Sep 30 06:45:03 compute-0 dbus-broker-launch[807]: Noticed file-system modification, trigger reload.
Sep 30 06:45:04 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 06:45:04 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 06:45:04 compute-0 systemd[1]: Reloading.
Sep 30 06:45:04 compute-0 systemd-rc-local-generator[35440]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:45:04 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 06:45:05 compute-0 sudo[35377]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:05 compute-0 python3.9[36646]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:45:06 compute-0 python3.9[37726]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Sep 30 06:45:07 compute-0 python3.9[38436]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:45:08 compute-0 sudo[39348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgfutukuerbnbkxjbadznbqifunpptwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214708.083069-783-233673113565022/AnsiballZ_command.py'
Sep 30 06:45:08 compute-0 sudo[39348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:08 compute-0 python3.9[39362]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:45:08 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Sep 30 06:45:08 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 06:45:08 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 06:45:08 compute-0 systemd[1]: man-db-cache-update.service: Consumed 5.669s CPU time.
Sep 30 06:45:08 compute-0 systemd[1]: run-r4fea2f3c241a4b1d9dfbc76787a49e25.service: Deactivated successfully.
Sep 30 06:45:09 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Sep 30 06:45:09 compute-0 sudo[39348]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:09 compute-0 sudo[39955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwizjrwytwdpbpkmtydquyhhggozgnlp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214709.504384-801-213630903277767/AnsiballZ_systemd.py'
Sep 30 06:45:09 compute-0 sudo[39955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:10 compute-0 python3.9[39957]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:45:10 compute-0 systemd[1]: Stopping Dynamic System Tuning Daemon...
Sep 30 06:45:10 compute-0 systemd[1]: tuned.service: Deactivated successfully.
Sep 30 06:45:10 compute-0 systemd[1]: Stopped Dynamic System Tuning Daemon.
Sep 30 06:45:10 compute-0 systemd[1]: Starting Dynamic System Tuning Daemon...
Sep 30 06:45:10 compute-0 systemd[1]: Started Dynamic System Tuning Daemon.
Sep 30 06:45:10 compute-0 sudo[39955]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:11 compute-0 python3.9[40118]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Sep 30 06:45:13 compute-0 sudo[40268]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iffezwyfxdwolpxfqwlhofzdmshbkbky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214713.4414334-915-201934330201803/AnsiballZ_systemd.py'
Sep 30 06:45:13 compute-0 sudo[40268]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:14 compute-0 python3.9[40270]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:45:14 compute-0 systemd[1]: Reloading.
Sep 30 06:45:14 compute-0 systemd-rc-local-generator[40300]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:45:14 compute-0 sudo[40268]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:15 compute-0 sudo[40457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfhsvgmegogiecztmjndgohgmsnqwpbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214714.5219822-915-265934905617742/AnsiballZ_systemd.py'
Sep 30 06:45:15 compute-0 sudo[40457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:15 compute-0 python3.9[40459]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:45:15 compute-0 systemd[1]: Reloading.
Sep 30 06:45:15 compute-0 systemd-rc-local-generator[40486]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:45:15 compute-0 sudo[40457]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:16 compute-0 sudo[40646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxktjkgsapzronambumzyyvggctqmvrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214716.0116746-947-245826129047103/AnsiballZ_command.py'
Sep 30 06:45:16 compute-0 sudo[40646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:16 compute-0 python3.9[40648]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:45:16 compute-0 sudo[40646]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:17 compute-0 sudo[40799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-boorjjsqjfqhcptnsukectyeqiigdvlg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214716.7730072-963-110307330924689/AnsiballZ_command.py'
Sep 30 06:45:17 compute-0 sudo[40799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:17 compute-0 python3.9[40801]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:45:17 compute-0 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Sep 30 06:45:17 compute-0 sudo[40799]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:17 compute-0 sudo[40952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqinteecntqkgcdxnjmfuhwlmuhczaao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214717.5951211-979-196197310850821/AnsiballZ_command.py'
Sep 30 06:45:17 compute-0 sudo[40952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:18 compute-0 python3.9[40954]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:45:19 compute-0 sudo[40952]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:20 compute-0 sudo[41114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iybmiwiwjfwiuvetrpylexvonwccsxab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214720.005667-995-240888590277567/AnsiballZ_command.py'
Sep 30 06:45:20 compute-0 sudo[41114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:20 compute-0 python3.9[41116]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:45:20 compute-0 sudo[41114]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:21 compute-0 sudo[41267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msepbclnhdtkbejsjkuqjepbdustiwio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214720.8341424-1011-64668844657965/AnsiballZ_systemd.py'
Sep 30 06:45:21 compute-0 sudo[41267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:21 compute-0 python3.9[41269]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 06:45:21 compute-0 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 30 06:45:21 compute-0 systemd[1]: Stopped Apply Kernel Variables.
Sep 30 06:45:21 compute-0 systemd[1]: Stopping Apply Kernel Variables...
Sep 30 06:45:21 compute-0 systemd[1]: Starting Apply Kernel Variables...
Sep 30 06:45:21 compute-0 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 30 06:45:21 compute-0 systemd[1]: Finished Apply Kernel Variables.
Sep 30 06:45:21 compute-0 sudo[41267]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:22 compute-0 sshd-session[28262]: Connection closed by 192.168.122.30 port 50004
Sep 30 06:45:22 compute-0 sshd-session[28259]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:45:22 compute-0 systemd[1]: session-11.scope: Deactivated successfully.
Sep 30 06:45:22 compute-0 systemd[1]: session-11.scope: Consumed 2min 15.169s CPU time.
Sep 30 06:45:22 compute-0 systemd-logind[824]: Session 11 logged out. Waiting for processes to exit.
Sep 30 06:45:22 compute-0 systemd-logind[824]: Removed session 11.
Sep 30 06:45:27 compute-0 sshd-session[41299]: Accepted publickey for zuul from 192.168.122.30 port 49976 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 06:45:27 compute-0 systemd-logind[824]: New session 12 of user zuul.
Sep 30 06:45:27 compute-0 systemd[1]: Started Session 12 of User zuul.
Sep 30 06:45:27 compute-0 sshd-session[41299]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:45:28 compute-0 python3.9[41452]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:45:29 compute-0 python3.9[41606]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:45:30 compute-0 sudo[41760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgqzoglcvisglnfqtgxconmgbzmdleve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214729.9788795-80-39999534370759/AnsiballZ_command.py'
Sep 30 06:45:30 compute-0 sudo[41760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:30 compute-0 python3.9[41762]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:45:30 compute-0 sudo[41760]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:31 compute-0 python3.9[41913]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:45:32 compute-0 sudo[42067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlxanwtxgzqmkqcrwapaqawirbnhjxve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214732.1749616-120-92154317964784/AnsiballZ_setup.py'
Sep 30 06:45:32 compute-0 sudo[42067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:32 compute-0 python3.9[42069]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 06:45:33 compute-0 sudo[42067]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:33 compute-0 sudo[42151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tucanhhbrdtulrrfvhwkjulxagkadmox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214732.1749616-120-92154317964784/AnsiballZ_dnf.py'
Sep 30 06:45:33 compute-0 sudo[42151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:33 compute-0 python3.9[42153]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 06:45:35 compute-0 sudo[42151]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:35 compute-0 sudo[42304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spsnvgskyknfimgyqpyzbmqefbzczeey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214735.2945259-144-269763010362332/AnsiballZ_setup.py'
Sep 30 06:45:35 compute-0 sudo[42304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:36 compute-0 python3.9[42306]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 06:45:36 compute-0 sudo[42304]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:36 compute-0 sudo[42475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atkevnvsoardmgzpuxtgfnjyegodfdob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214736.486443-166-117264269148859/AnsiballZ_file.py'
Sep 30 06:45:36 compute-0 sudo[42475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:37 compute-0 python3.9[42477]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:45:37 compute-0 sudo[42475]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:37 compute-0 sudo[42627]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxpthuiljxmofimpdxochykmqwxzmccl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214737.424552-182-130534758295510/AnsiballZ_command.py'
Sep 30 06:45:37 compute-0 sudo[42627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:38 compute-0 python3.9[42629]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:45:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat2136523628-merged.mount: Deactivated successfully.
Sep 30 06:45:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-metacopy\x2dcheck2914264221-merged.mount: Deactivated successfully.
Sep 30 06:45:38 compute-0 podman[42630]: 2025-09-30 06:45:38.252815395 +0000 UTC m=+0.215899756 system refresh
Sep 30 06:45:38 compute-0 sudo[42627]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:39 compute-0 sudo[42791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubrmhibmmgawvgyrcwufnmgzfyfmzjef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214738.478212-198-227847054966674/AnsiballZ_stat.py'
Sep 30 06:45:39 compute-0 sudo[42791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:39 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:45:39 compute-0 python3.9[42793]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:45:39 compute-0 sudo[42791]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:39 compute-0 sudo[42914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkaelzxzhtgkjiaodlauisbavtqyzqki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214738.478212-198-227847054966674/AnsiballZ_copy.py'
Sep 30 06:45:39 compute-0 sudo[42914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:39 compute-0 python3.9[42916]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759214738.478212-198-227847054966674/.source.json follow=False _original_basename=podman_network_config.j2 checksum=583a55ca9cdcb6c37e886549af3cab2485db222b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:45:39 compute-0 sudo[42914]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:40 compute-0 sudo[43066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thaiydjytcynyvkdkiyzpumdetlkblmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214740.1411753-228-122968447517448/AnsiballZ_stat.py'
Sep 30 06:45:40 compute-0 sudo[43066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:40 compute-0 python3.9[43068]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:45:40 compute-0 sudo[43066]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:41 compute-0 sudo[43189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pffrgcvhebfztrhxesjubbxouwjwbrez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214740.1411753-228-122968447517448/AnsiballZ_copy.py'
Sep 30 06:45:41 compute-0 sudo[43189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:41 compute-0 python3.9[43191]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759214740.1411753-228-122968447517448/.source.conf follow=False _original_basename=registries.conf.j2 checksum=b723c254c5347521a0bd9978182359a7d08823fc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:45:41 compute-0 sudo[43189]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:42 compute-0 sudo[43341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbpzjikynxmzskqusdirloeljewfzpum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214741.747317-260-269359771868866/AnsiballZ_ini_file.py'
Sep 30 06:45:42 compute-0 sudo[43341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:42 compute-0 python3.9[43343]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:45:42 compute-0 sudo[43341]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:43 compute-0 sudo[43493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpzfiaxeqhtthnrmpncybmenqvgeyhly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214742.6538134-260-214776148623239/AnsiballZ_ini_file.py'
Sep 30 06:45:43 compute-0 sudo[43493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:43 compute-0 python3.9[43495]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:45:43 compute-0 sudo[43493]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:43 compute-0 sudo[43645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvujgaxcftuvazwrxpcrballvwjsziwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214743.4758205-260-227252415815017/AnsiballZ_ini_file.py'
Sep 30 06:45:43 compute-0 sudo[43645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:43 compute-0 python3.9[43647]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:45:44 compute-0 sudo[43645]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:44 compute-0 sudo[43797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pklmgvcqkbfrpruoeraufrlckvjrmeou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214744.184623-260-59626787254111/AnsiballZ_ini_file.py'
Sep 30 06:45:44 compute-0 sudo[43797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:44 compute-0 python3.9[43799]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:45:44 compute-0 sudo[43797]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:45 compute-0 python3.9[43949]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:45:46 compute-0 sudo[44101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spuqtrmihwndrcvmohleoyxajnfrfirk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214745.9815626-340-134334678218904/AnsiballZ_dnf.py'
Sep 30 06:45:46 compute-0 sudo[44101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:46 compute-0 python3.9[44103]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 06:45:47 compute-0 sshd-session[44104]: Invalid user admin from 152.32.253.152 port 49918
Sep 30 06:45:47 compute-0 sshd-session[44104]: Received disconnect from 152.32.253.152 port 49918:11: Bye Bye [preauth]
Sep 30 06:45:47 compute-0 sshd-session[44104]: Disconnected from invalid user admin 152.32.253.152 port 49918 [preauth]
Sep 30 06:45:47 compute-0 sudo[44101]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:48 compute-0 sudo[44256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksgqutbgitdrdyvglrhtfwpfuhvzuker ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214747.8848073-356-111783416657112/AnsiballZ_dnf.py'
Sep 30 06:45:48 compute-0 sudo[44256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:48 compute-0 python3.9[44258]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 06:45:50 compute-0 sudo[44256]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:50 compute-0 sudo[44416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fioiqarmgxeepvaprfnrpxfvloruwqur ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214750.6615233-376-57366835252783/AnsiballZ_dnf.py'
Sep 30 06:45:50 compute-0 sudo[44416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:51 compute-0 python3.9[44418]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 06:45:52 compute-0 sudo[44416]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:53 compute-0 sudo[44569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xagyafgwqeggzrjctaiujfyeasdcsnxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214752.8179631-394-93435942602190/AnsiballZ_dnf.py'
Sep 30 06:45:53 compute-0 sudo[44569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:53 compute-0 python3.9[44571]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 06:45:54 compute-0 sudo[44569]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:55 compute-0 sudo[44722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxdqhgjujivyilxfozmmbthjtewszefy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214755.2594888-416-146265596153576/AnsiballZ_dnf.py'
Sep 30 06:45:55 compute-0 sudo[44722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:55 compute-0 python3.9[44724]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 06:45:57 compute-0 sudo[44722]: pam_unix(sudo:session): session closed for user root
Sep 30 06:45:57 compute-0 sudo[44878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ptsmohkapntqpgxcvvzjorvugnhfupcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214757.61648-432-60888177672596/AnsiballZ_dnf.py'
Sep 30 06:45:57 compute-0 sudo[44878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:45:58 compute-0 python3.9[44880]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 06:46:01 compute-0 sudo[44878]: pam_unix(sudo:session): session closed for user root
Sep 30 06:46:02 compute-0 sudo[45047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgosgrdpmbzxluwfkiptmjrxxkfbaldy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214762.3176882-450-170793294809492/AnsiballZ_dnf.py'
Sep 30 06:46:02 compute-0 sudo[45047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:46:02 compute-0 python3.9[45049]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 06:46:04 compute-0 sudo[45047]: pam_unix(sudo:session): session closed for user root
Sep 30 06:46:04 compute-0 sudo[45200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nggktzymbixoohzcohcifsdptewuypme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214764.6166613-468-126636977027977/AnsiballZ_dnf.py'
Sep 30 06:46:04 compute-0 sudo[45200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:46:05 compute-0 python3.9[45202]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 06:46:17 compute-0 sudo[45200]: pam_unix(sudo:session): session closed for user root
Sep 30 06:46:18 compute-0 sudo[45538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asokjbyogkqgrukjgltkgpiyrinklwhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214777.6487892-490-239084454174908/AnsiballZ_file.py'
Sep 30 06:46:18 compute-0 sudo[45538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:46:18 compute-0 python3.9[45540]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:46:18 compute-0 sudo[45538]: pam_unix(sudo:session): session closed for user root
Sep 30 06:46:18 compute-0 sudo[45713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ujuwqjcqxxvnlegkosielsfpnuvxmduk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214778.4918752-506-243331279169416/AnsiballZ_stat.py'
Sep 30 06:46:18 compute-0 sudo[45713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:46:19 compute-0 python3.9[45715]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:46:19 compute-0 sudo[45713]: pam_unix(sudo:session): session closed for user root
Sep 30 06:46:19 compute-0 sudo[45836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oymwppbebpfiebolomzbvpdrhehzlllk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214778.4918752-506-243331279169416/AnsiballZ_copy.py'
Sep 30 06:46:19 compute-0 sudo[45836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:46:19 compute-0 python3.9[45838]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1759214778.4918752-506-243331279169416/.source.json _original_basename=.36lxg1v3 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:46:19 compute-0 sudo[45836]: pam_unix(sudo:session): session closed for user root
Sep 30 06:46:20 compute-0 sudo[45988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wuijqizlphkkhokqjifmcosekkaigcft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214780.187953-542-29987115583589/AnsiballZ_podman_image.py'
Sep 30 06:46:20 compute-0 sudo[45988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:46:20 compute-0 python3.9[45990]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 06:46:20 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-compat1273101282-lower\x2dmapped.mount: Deactivated successfully.
Sep 30 06:46:28 compute-0 podman[46003]: 2025-09-30 06:46:28.857006313 +0000 UTC m=+7.847316259 image pull 36e09fb90e558c69a5cd1d9e675a0cddae2912ee81c1af712f9b1ec1a4a5791d 38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Sep 30 06:46:28 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:28 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:28 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:29 compute-0 sudo[45988]: pam_unix(sudo:session): session closed for user root
Sep 30 06:46:29 compute-0 sudo[46298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtgikvxiugicmwzycaufvwqiiienilum ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214789.43821-560-277404142574704/AnsiballZ_podman_image.py'
Sep 30 06:46:29 compute-0 sudo[46298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:46:30 compute-0 python3.9[46300]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 06:46:30 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:31 compute-0 podman[46312]: 2025-09-30 06:46:31.634647202 +0000 UTC m=+1.583357597 image pull cde2f83a5b58d30e0d6b5c078f3d1ef55892b021d0701b493b9597be9f28e4aa 38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Sep 30 06:46:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:31 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:31 compute-0 sudo[46298]: pam_unix(sudo:session): session closed for user root
Sep 30 06:46:32 compute-0 sudo[46567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmkndudwiaebomaxwqopmftiricuncnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214792.4004438-582-94539022180430/AnsiballZ_podman_image.py'
Sep 30 06:46:32 compute-0 sudo[46567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:46:32 compute-0 python3.9[46569]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 06:46:33 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:41 compute-0 podman[46582]: 2025-09-30 06:46:41.724674211 +0000 UTC m=+8.666669973 image pull eeebcc09bc72f81ab45f5ab87eb8f6a7b554b949227aeec082bdb0732754ddc8 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 06:46:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:41 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:42 compute-0 sudo[46567]: pam_unix(sudo:session): session closed for user root
Sep 30 06:46:42 compute-0 sudo[46862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-slhshbhnmlwdzvutfablaqygddwrucrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214802.386521-602-64936640342243/AnsiballZ_podman_image.py'
Sep 30 06:46:42 compute-0 sudo[46862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:46:42 compute-0 python3.9[46864]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 06:46:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:43 compute-0 podman[46876]: 2025-09-30 06:46:43.493949004 +0000 UTC m=+0.459243991 image pull 0fb6856fe8f53101c9a68be625474646cbb6c5306dfa9570ef7defb7c487fcd5 38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Sep 30 06:46:43 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:43 compute-0 sudo[46862]: pam_unix(sudo:session): session closed for user root
Sep 30 06:46:44 compute-0 sudo[47109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxvqpdqlwlfduqitfwwhlngyrqpnqfdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214804.0613267-620-10883991411305/AnsiballZ_podman_image.py'
Sep 30 06:46:44 compute-0 sudo[47109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:46:44 compute-0 python3.9[47111]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 06:46:44 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:50 compute-0 sshd-session[47164]: Invalid user foundry from 152.32.253.152 port 45234
Sep 30 06:46:51 compute-0 sshd-session[47164]: Received disconnect from 152.32.253.152 port 45234:11: Bye Bye [preauth]
Sep 30 06:46:51 compute-0 sshd-session[47164]: Disconnected from invalid user foundry 152.32.253.152 port 45234 [preauth]
Sep 30 06:46:55 compute-0 podman[47123]: 2025-09-30 06:46:55.384748158 +0000 UTC m=+10.738647333 image pull b4e0ba921b5ecb84b5b785b68bb6d15e43854720aa99c361795320d2a08a3eee 38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Sep 30 06:46:55 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:55 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:55 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:46:55 compute-0 sudo[47109]: pam_unix(sudo:session): session closed for user root
Sep 30 06:46:56 compute-0 sudo[47383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtdjcxqnufzjwmyrpxvkwcqxquupfogp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214815.998399-642-113856149600995/AnsiballZ_podman_image.py'
Sep 30 06:46:56 compute-0 sudo[47383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:46:56 compute-0 python3.9[47385]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=38.102.83.30:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 06:46:56 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:47:00 compute-0 podman[47396]: 2025-09-30 06:47:00.689526411 +0000 UTC m=+4.099454203 image pull cb7788b907635032067af0c50a359407f6152974b9280ecf95a1ef9d3d70aa41 38.102.83.30:5001/podified-master-centos10/openstack-ceilometer-compute:watcher_latest
Sep 30 06:47:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:47:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:47:00 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:47:00 compute-0 sudo[47383]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:01 compute-0 sudo[47651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afmfdfwuuabzkysjutsnpufuviayoitk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214821.1217346-642-193857495036143/AnsiballZ_podman_image.py'
Sep 30 06:47:01 compute-0 sudo[47651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:01 compute-0 python3.9[47653]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Sep 30 06:47:02 compute-0 podman[47665]: 2025-09-30 06:47:02.867583111 +0000 UTC m=+1.151454284 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Sep 30 06:47:02 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:47:02 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:47:02 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:47:03 compute-0 sudo[47651]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:03 compute-0 sshd-session[41302]: Connection closed by 192.168.122.30 port 49976
Sep 30 06:47:03 compute-0 sshd-session[41299]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:47:03 compute-0 systemd[1]: session-12.scope: Deactivated successfully.
Sep 30 06:47:03 compute-0 systemd[1]: session-12.scope: Consumed 1min 57.269s CPU time.
Sep 30 06:47:03 compute-0 systemd-logind[824]: Session 12 logged out. Waiting for processes to exit.
Sep 30 06:47:03 compute-0 systemd-logind[824]: Removed session 12.
Sep 30 06:47:09 compute-0 sshd-session[47814]: Accepted publickey for zuul from 192.168.122.30 port 41042 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 06:47:09 compute-0 systemd-logind[824]: New session 13 of user zuul.
Sep 30 06:47:09 compute-0 systemd[1]: Started Session 13 of User zuul.
Sep 30 06:47:09 compute-0 sshd-session[47814]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:47:10 compute-0 python3.9[47967]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:47:11 compute-0 sudo[48121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkdvxfalywxsjivmdcrvpkezptcjcyns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214831.3829873-52-121409824600759/AnsiballZ_getent.py'
Sep 30 06:47:11 compute-0 sudo[48121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:12 compute-0 python3.9[48123]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Sep 30 06:47:12 compute-0 sudo[48121]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:12 compute-0 sudo[48274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbsezlpiemywwavvypmcxmpvzlbvxfvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214832.3451781-68-171225282450884/AnsiballZ_group.py'
Sep 30 06:47:12 compute-0 sudo[48274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:13 compute-0 python3.9[48276]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 06:47:13 compute-0 groupadd[48277]: group added to /etc/group: name=openvswitch, GID=42476
Sep 30 06:47:13 compute-0 groupadd[48277]: group added to /etc/gshadow: name=openvswitch
Sep 30 06:47:13 compute-0 groupadd[48277]: new group: name=openvswitch, GID=42476
Sep 30 06:47:13 compute-0 sudo[48274]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:13 compute-0 sudo[48432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iyicenmqdgnohkjxhdqhkgldqujfzpzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214833.3515487-84-28415023990544/AnsiballZ_user.py'
Sep 30 06:47:13 compute-0 sudo[48432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:14 compute-0 python3.9[48434]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 06:47:14 compute-0 useradd[48436]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 06:47:14 compute-0 useradd[48436]: add 'openvswitch' to group 'hugetlbfs'
Sep 30 06:47:14 compute-0 useradd[48436]: add 'openvswitch' to shadow group 'hugetlbfs'
Sep 30 06:47:14 compute-0 sudo[48432]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:14 compute-0 sudo[48592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eppztcdlspqhsezhduuodgoqzdbqramb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214834.5960178-104-90256205958588/AnsiballZ_setup.py'
Sep 30 06:47:14 compute-0 sudo[48592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:15 compute-0 python3.9[48594]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 06:47:15 compute-0 sudo[48592]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:15 compute-0 sudo[48676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfueushmoxhfukqjtwqglzyhygtubbyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214834.5960178-104-90256205958588/AnsiballZ_dnf.py'
Sep 30 06:47:15 compute-0 sudo[48676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:16 compute-0 python3.9[48678]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 06:47:17 compute-0 sudo[48676]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:18 compute-0 sudo[48837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuurvhamkatynfaoxunskolozbvpooid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214838.055834-132-26769284180862/AnsiballZ_dnf.py'
Sep 30 06:47:18 compute-0 sudo[48837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:18 compute-0 python3.9[48839]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 06:47:30 compute-0 kernel: SELinux:  Converting 2725 SID table entries...
Sep 30 06:47:30 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 06:47:30 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 06:47:30 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 06:47:30 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 06:47:30 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 06:47:30 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 06:47:30 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 06:47:30 compute-0 groupadd[48862]: group added to /etc/group: name=unbound, GID=993
Sep 30 06:47:30 compute-0 groupadd[48862]: group added to /etc/gshadow: name=unbound
Sep 30 06:47:30 compute-0 groupadd[48862]: new group: name=unbound, GID=993
Sep 30 06:47:30 compute-0 useradd[48869]: new user: name=unbound, UID=993, GID=993, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Sep 30 06:47:30 compute-0 dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Sep 30 06:47:30 compute-0 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Sep 30 06:47:32 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 06:47:32 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 06:47:32 compute-0 systemd[1]: Reloading.
Sep 30 06:47:32 compute-0 systemd-rc-local-generator[49379]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:47:32 compute-0 systemd-sysv-generator[49384]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:47:32 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 06:47:33 compute-0 sudo[48837]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:33 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 06:47:33 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 06:47:33 compute-0 systemd[1]: man-db-cache-update.service: Consumed 1.005s CPU time.
Sep 30 06:47:33 compute-0 systemd[1]: run-r10e971035fea4461a2dfc3b2620a9a91.service: Deactivated successfully.
Sep 30 06:47:34 compute-0 sudo[49939]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awwihnvtmapsalfxqwuhhtgpvmxmyykf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214853.3492763-148-98441976451076/AnsiballZ_systemd.py'
Sep 30 06:47:34 compute-0 sudo[49939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:34 compute-0 python3.9[49941]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 06:47:34 compute-0 systemd[1]: Reloading.
Sep 30 06:47:34 compute-0 systemd-rc-local-generator[49968]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:47:34 compute-0 systemd-sysv-generator[49973]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:47:34 compute-0 systemd[1]: Starting Open vSwitch Database Unit...
Sep 30 06:47:34 compute-0 chown[49982]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Sep 30 06:47:34 compute-0 ovs-ctl[49987]: /etc/openvswitch/conf.db does not exist ... (warning).
Sep 30 06:47:34 compute-0 ovs-ctl[49987]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Sep 30 06:47:34 compute-0 ovs-ctl[49987]: Starting ovsdb-server [  OK  ]
Sep 30 06:47:34 compute-0 ovs-vsctl[50036]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Sep 30 06:47:35 compute-0 ovs-vsctl[50056]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"01429670-4ea1-4dab-babc-4bc628cc01bb\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Sep 30 06:47:35 compute-0 ovs-ctl[49987]: Configuring Open vSwitch system IDs [  OK  ]
Sep 30 06:47:35 compute-0 ovs-ctl[49987]: Enabling remote OVSDB managers [  OK  ]
Sep 30 06:47:35 compute-0 systemd[1]: Started Open vSwitch Database Unit.
Sep 30 06:47:35 compute-0 ovs-vsctl[50062]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Sep 30 06:47:35 compute-0 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Sep 30 06:47:35 compute-0 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Sep 30 06:47:35 compute-0 systemd[1]: Starting Open vSwitch Forwarding Unit...
Sep 30 06:47:35 compute-0 kernel: openvswitch: Open vSwitch switching datapath
Sep 30 06:47:35 compute-0 ovs-ctl[50106]: Inserting openvswitch module [  OK  ]
Sep 30 06:47:35 compute-0 ovs-ctl[50075]: Starting ovs-vswitchd [  OK  ]
Sep 30 06:47:35 compute-0 ovs-vsctl[50126]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-0
Sep 30 06:47:35 compute-0 ovs-ctl[50075]: Enabling remote OVSDB managers [  OK  ]
Sep 30 06:47:35 compute-0 systemd[1]: Started Open vSwitch Forwarding Unit.
Sep 30 06:47:35 compute-0 systemd[1]: Starting Open vSwitch...
Sep 30 06:47:35 compute-0 systemd[1]: Finished Open vSwitch.
Sep 30 06:47:35 compute-0 sudo[49939]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:36 compute-0 python3.9[50278]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:47:37 compute-0 sudo[50428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oxwlnupqnjagrkmumoahoxqwagrmaecw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214856.8372805-184-109557944913777/AnsiballZ_sefcontext.py'
Sep 30 06:47:37 compute-0 sudo[50428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:37 compute-0 python3.9[50430]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Sep 30 06:47:38 compute-0 kernel: SELinux:  Converting 2739 SID table entries...
Sep 30 06:47:38 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 06:47:38 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 06:47:38 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 06:47:38 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 06:47:38 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 06:47:38 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 06:47:38 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 06:47:39 compute-0 sudo[50428]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:39 compute-0 python3.9[50585]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:47:40 compute-0 sudo[50741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mizbhgmdqumhkkcvvhqoaogmdhdvubwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214860.4709895-220-31591212727218/AnsiballZ_dnf.py'
Sep 30 06:47:40 compute-0 dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Sep 30 06:47:40 compute-0 sudo[50741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:41 compute-0 python3.9[50743]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 06:47:42 compute-0 sudo[50741]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:43 compute-0 sudo[50894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqiwiavmrewwwiscuxnlrbcyacwkrhfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214862.6136281-236-68334944415320/AnsiballZ_command.py'
Sep 30 06:47:43 compute-0 sudo[50894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:43 compute-0 python3.9[50896]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:47:44 compute-0 sudo[50894]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:44 compute-0 sudo[51181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntlxipfwmwumkszevklzleccdprxzgbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214864.3476224-252-196981568225428/AnsiballZ_file.py'
Sep 30 06:47:44 compute-0 sudo[51181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:45 compute-0 python3.9[51183]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 06:47:45 compute-0 sudo[51181]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:46 compute-0 python3.9[51333]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:47:46 compute-0 sudo[51485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgrcpivgmfcggovbthsdfynofisnemae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214866.2604926-284-106574712200364/AnsiballZ_dnf.py'
Sep 30 06:47:46 compute-0 sudo[51485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:46 compute-0 python3.9[51487]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 06:47:48 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 06:47:48 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 06:47:48 compute-0 systemd[1]: Reloading.
Sep 30 06:47:48 compute-0 systemd-sysv-generator[51529]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:47:48 compute-0 systemd-rc-local-generator[51526]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:47:49 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 06:47:49 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 06:47:49 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 06:47:49 compute-0 systemd[1]: run-ra7e7dd2df3154ec6af096153711b6e8e.service: Deactivated successfully.
Sep 30 06:47:49 compute-0 sudo[51485]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:50 compute-0 sudo[51801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ertekawnygrfsepnhwwukdrwxjwwmyyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214869.7015293-300-89225271315901/AnsiballZ_systemd.py'
Sep 30 06:47:50 compute-0 sudo[51801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:50 compute-0 python3.9[51803]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 06:47:50 compute-0 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Sep 30 06:47:50 compute-0 systemd[1]: Stopped Network Manager Wait Online.
Sep 30 06:47:50 compute-0 systemd[1]: Stopping Network Manager Wait Online...
Sep 30 06:47:50 compute-0 systemd[1]: Stopping Network Manager...
Sep 30 06:47:50 compute-0 NetworkManager[3981]: <info>  [1759214870.4871] caught SIGTERM, shutting down normally.
Sep 30 06:47:50 compute-0 NetworkManager[3981]: <info>  [1759214870.4902] dhcp4 (eth0): canceled DHCP transaction
Sep 30 06:47:50 compute-0 NetworkManager[3981]: <info>  [1759214870.4902] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 06:47:50 compute-0 NetworkManager[3981]: <info>  [1759214870.4903] dhcp4 (eth0): state changed no lease
Sep 30 06:47:50 compute-0 NetworkManager[3981]: <info>  [1759214870.4910] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 06:47:50 compute-0 NetworkManager[3981]: <info>  [1759214870.5033] exiting (success)
Sep 30 06:47:50 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 06:47:50 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 06:47:50 compute-0 systemd[1]: NetworkManager.service: Deactivated successfully.
Sep 30 06:47:50 compute-0 systemd[1]: Stopped Network Manager.
Sep 30 06:47:50 compute-0 systemd[1]: NetworkManager.service: Consumed 14.705s CPU time, 4.1M memory peak, read 0B from disk, written 30.0K to disk.
Sep 30 06:47:50 compute-0 systemd[1]: Starting Network Manager...
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.5875] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:5ec163b0-1932-4293-bd17-8c478fff576e)
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.5878] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.5958] manager[0x5607708ea090]: monitoring kernel firmware directory '/lib/firmware'.
Sep 30 06:47:50 compute-0 systemd[1]: Starting Hostname Service...
Sep 30 06:47:50 compute-0 systemd[1]: Started Hostname Service.
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6722] hostname: hostname: using hostnamed
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6725] hostname: static hostname changed from (none) to "compute-0"
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6732] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6739] manager[0x5607708ea090]: rfkill: Wi-Fi hardware radio set enabled
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6740] manager[0x5607708ea090]: rfkill: WWAN hardware radio set enabled
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6776] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6793] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6794] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6795] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6796] manager: Networking is enabled by state file
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6799] settings: Loaded settings plugin: keyfile (internal)
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6805] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6846] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6860] dhcp: init: Using DHCP client 'internal'
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6866] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6873] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6885] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6897] device (lo): Activation: starting connection 'lo' (dd23f76c-752a-4e70-b19b-d6c1272b025e)
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6908] device (eth0): carrier: link connected
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6914] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6922] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6922] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6932] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6943] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6952] device (eth1): carrier: link connected
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6958] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6965] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (1fa3647f-a0b3-57b1-8a07-78f0592e2b89) (indicated)
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6966] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6975] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6987] device (eth1): Activation: starting connection 'ci-private-network' (1fa3647f-a0b3-57b1-8a07-78f0592e2b89)
Sep 30 06:47:50 compute-0 systemd[1]: Started Network Manager.
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.6999] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7010] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7027] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7030] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7032] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7035] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7036] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7038] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7043] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7049] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7051] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7071] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7083] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7108] dhcp4 (eth0): state changed new lease, address=38.102.83.22
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7112] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7174] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Sep 30 06:47:50 compute-0 sudo[51801]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7494] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7498] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7503] device (lo): Activation: successful, device activated.
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7509] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7510] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7514] manager: NetworkManager state is now CONNECTED_LOCAL
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7517] device (eth1): Activation: successful, device activated.
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7529] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7531] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7535] manager: NetworkManager state is now CONNECTED_SITE
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7539] device (eth0): Activation: successful, device activated.
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7544] manager: NetworkManager state is now CONNECTED_GLOBAL
Sep 30 06:47:50 compute-0 NetworkManager[51813]: <info>  [1759214870.7547] manager: startup complete
Sep 30 06:47:50 compute-0 systemd[1]: Starting Network Manager Wait Online...
Sep 30 06:47:50 compute-0 systemd[1]: Finished Network Manager Wait Online.
Sep 30 06:47:51 compute-0 sudo[52027]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhgtqbsafqgjyyfstlmbpxnykauwgooy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214870.9444888-316-142598641409323/AnsiballZ_dnf.py'
Sep 30 06:47:51 compute-0 sudo[52027]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:51 compute-0 python3.9[52029]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 06:47:54 compute-0 sshd-session[52036]: Invalid user cosmos from 152.32.253.152 port 40552
Sep 30 06:47:54 compute-0 sshd-session[52036]: Received disconnect from 152.32.253.152 port 40552:11: Bye Bye [preauth]
Sep 30 06:47:54 compute-0 sshd-session[52036]: Disconnected from invalid user cosmos 152.32.253.152 port 40552 [preauth]
Sep 30 06:47:56 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 06:47:56 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 06:47:56 compute-0 systemd[1]: Reloading.
Sep 30 06:47:56 compute-0 systemd-rc-local-generator[52083]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:47:56 compute-0 systemd-sysv-generator[52087]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:47:57 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 06:47:57 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 06:47:57 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 06:47:57 compute-0 systemd[1]: run-r1471036f37e844d0871c892ce09662a9.service: Deactivated successfully.
Sep 30 06:47:57 compute-0 sudo[52027]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:58 compute-0 sudo[52491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imnmzcupwddemmcziqftvhztynengxmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214878.2406073-340-153418594897215/AnsiballZ_stat.py'
Sep 30 06:47:58 compute-0 sudo[52491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:58 compute-0 python3.9[52493]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:47:58 compute-0 sudo[52491]: pam_unix(sudo:session): session closed for user root
Sep 30 06:47:59 compute-0 sudo[52643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbimwxxslmlignhrtamzixqlybcerhqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214879.0488677-358-277197218497048/AnsiballZ_ini_file.py'
Sep 30 06:47:59 compute-0 sudo[52643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:47:59 compute-0 python3.9[52645]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:47:59 compute-0 sudo[52643]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:00 compute-0 sudo[52797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgnuscdcxqzowrvtqeaxyinfrqwltkph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214880.137134-378-45966196278371/AnsiballZ_ini_file.py'
Sep 30 06:48:00 compute-0 sudo[52797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:00 compute-0 python3.9[52799]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:48:00 compute-0 sudo[52797]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:00 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 06:48:01 compute-0 sudo[52949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxiwakchdnqwhmnbexyayomihcabkeif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214880.9571753-378-260464494944094/AnsiballZ_ini_file.py'
Sep 30 06:48:01 compute-0 sudo[52949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:01 compute-0 python3.9[52951]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:48:01 compute-0 sudo[52949]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:01 compute-0 sudo[53101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwljasvdzkpjceceqcuxorhikrmzaikl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214881.661095-408-166646187521781/AnsiballZ_ini_file.py'
Sep 30 06:48:01 compute-0 sudo[53101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:02 compute-0 python3.9[53103]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:48:02 compute-0 sudo[53101]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:02 compute-0 sudo[53253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vkojwkfapajujliuruqwzserxzdlifck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214882.385801-408-45862568134201/AnsiballZ_ini_file.py'
Sep 30 06:48:02 compute-0 sudo[53253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:02 compute-0 python3.9[53255]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:48:02 compute-0 sudo[53253]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:03 compute-0 sudo[53405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drrmagwgrrxtoedpilsrdoniyolbkvjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214883.148791-438-71163721393594/AnsiballZ_stat.py'
Sep 30 06:48:03 compute-0 sudo[53405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:03 compute-0 python3.9[53407]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:48:03 compute-0 sudo[53405]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:04 compute-0 sudo[53528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-puvmtwpwalsrvjzmkzmjcjhctlkxvxev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214883.148791-438-71163721393594/AnsiballZ_copy.py'
Sep 30 06:48:04 compute-0 sudo[53528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:04 compute-0 python3.9[53530]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759214883.148791-438-71163721393594/.source _original_basename=.4rg_2mvs follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:48:04 compute-0 sudo[53528]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:05 compute-0 sudo[53680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sswcxzqmcnulaipmkpevcfkiixmxebqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214884.8338134-468-199578006138561/AnsiballZ_file.py'
Sep 30 06:48:05 compute-0 sudo[53680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:05 compute-0 python3.9[53682]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:48:05 compute-0 sudo[53680]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:06 compute-0 sudo[53832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hovzzyysxyeauwyjumeapkdccdwgzlzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214885.6586118-484-70508271322687/AnsiballZ_edpm_os_net_config_mappings.py'
Sep 30 06:48:06 compute-0 sudo[53832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:06 compute-0 python3.9[53834]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Sep 30 06:48:06 compute-0 sudo[53832]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:07 compute-0 sudo[53984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcegtiwtzqnfrcikoufifudtocpqanji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214886.655964-502-6286308500999/AnsiballZ_file.py'
Sep 30 06:48:07 compute-0 sudo[53984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:07 compute-0 python3.9[53986]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:48:07 compute-0 sudo[53984]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:07 compute-0 sudo[54136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwznquryurzxgblkgfqsinmdcxcibqdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214887.5921965-522-137007779165167/AnsiballZ_stat.py'
Sep 30 06:48:07 compute-0 sudo[54136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:08 compute-0 sudo[54136]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:08 compute-0 sudo[54259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awecepztwzjarybgxbasyisvktxyurhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214887.5921965-522-137007779165167/AnsiballZ_copy.py'
Sep 30 06:48:08 compute-0 sudo[54259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:08 compute-0 sudo[54259]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:09 compute-0 sudo[54411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnrzpyhglrteadndrrlpcvgrwepwlmxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214888.9483497-552-187102001327339/AnsiballZ_slurp.py'
Sep 30 06:48:09 compute-0 sudo[54411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:09 compute-0 python3.9[54413]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Sep 30 06:48:09 compute-0 sudo[54411]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:10 compute-0 sudo[54586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvmewoqoawmgzyhmsgnlypphfcivmryn ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214890.04403-570-224867434360871/async_wrapper.py j632944549114 300 /home/zuul/.ansible/tmp/ansible-tmp-1759214890.04403-570-224867434360871/AnsiballZ_edpm_os_net_config.py _'
Sep 30 06:48:10 compute-0 sudo[54586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:11 compute-0 ansible-async_wrapper.py[54588]: Invoked with j632944549114 300 /home/zuul/.ansible/tmp/ansible-tmp-1759214890.04403-570-224867434360871/AnsiballZ_edpm_os_net_config.py _
Sep 30 06:48:11 compute-0 ansible-async_wrapper.py[54591]: Starting module and watcher
Sep 30 06:48:11 compute-0 ansible-async_wrapper.py[54591]: Start watching 54592 (300)
Sep 30 06:48:11 compute-0 ansible-async_wrapper.py[54592]: Start module (54592)
Sep 30 06:48:11 compute-0 ansible-async_wrapper.py[54588]: Return async_wrapper task started.
Sep 30 06:48:11 compute-0 sudo[54586]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:11 compute-0 python3.9[54593]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Sep 30 06:48:11 compute-0 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Sep 30 06:48:11 compute-0 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Sep 30 06:48:11 compute-0 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Sep 30 06:48:11 compute-0 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Sep 30 06:48:11 compute-0 kernel: cfg80211: failed to load regulatory.db
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.2742] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.2779] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3624] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3626] audit: op="connection-add" uuid="7ddecf05-fc00-4800-9668-fe24d2fd5401" name="br-ex-br" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3649] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3650] audit: op="connection-add" uuid="c9634aba-787c-4992-97ff-dd0741260682" name="br-ex-port" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3668] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3670] audit: op="connection-add" uuid="b646f5cd-2637-4c67-bb49-47523dbf791f" name="eth1-port" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3689] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3691] audit: op="connection-add" uuid="22d3e2b7-e58c-4584-abf9-0b2297778ceb" name="vlan20-port" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3707] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3708] audit: op="connection-add" uuid="ccac1618-998d-42e5-9c2b-62d36514aa8d" name="vlan21-port" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3728] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3730] audit: op="connection-add" uuid="6a13bc11-959c-42a2-bad9-cd6b68a1adeb" name="vlan22-port" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3757] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,connection.timestamp,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode,802-3-ethernet.mtu" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3783] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3785] audit: op="connection-add" uuid="42492381-262f-436d-8d86-341a5b2c65c3" name="br-ex-if" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3894] audit: op="connection-update" uuid="1fa3647f-a0b3-57b1-8a07-78f0592e2b89" name="ci-private-network" args="ipv4.never-default,ipv4.addresses,ipv4.dns,ipv4.method,ipv4.routing-rules,ipv4.routes,ovs-external-ids.data,connection.timestamp,connection.controller,connection.master,connection.port-type,connection.slave-type,ovs-interface.type,ipv6.addresses,ipv6.dns,ipv6.method,ipv6.addr-gen-mode,ipv6.routing-rules,ipv6.routes" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3917] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3920] audit: op="connection-add" uuid="163356c2-3ce3-40ef-96a8-6fd68efd94fc" name="vlan20-if" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3941] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3942] audit: op="connection-add" uuid="8b354faf-15a0-48c1-9d17-3244c623e9b0" name="vlan21-if" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3966] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3968] audit: op="connection-add" uuid="6db4324d-78f6-49c1-8066-670ae91c29e1" name="vlan22-if" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.3985] audit: op="connection-delete" uuid="78674247-54a0-3097-b9a3-7d78780bff1a" name="Wired connection 1" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4000] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4010] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4014] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (7ddecf05-fc00-4800-9668-fe24d2fd5401)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4015] audit: op="connection-activate" uuid="7ddecf05-fc00-4800-9668-fe24d2fd5401" name="br-ex-br" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4016] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4023] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4027] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (c9634aba-787c-4992-97ff-dd0741260682)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4029] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4035] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4039] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (b646f5cd-2637-4c67-bb49-47523dbf791f)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4041] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4048] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4053] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (22d3e2b7-e58c-4584-abf9-0b2297778ceb)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4055] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4061] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4066] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (ccac1618-998d-42e5-9c2b-62d36514aa8d)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4068] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4075] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4080] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (6a13bc11-959c-42a2-bad9-cd6b68a1adeb)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4080] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4083] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4085] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4092] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4097] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4101] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (42492381-262f-436d-8d86-341a5b2c65c3)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4102] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4105] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4107] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4108] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4110] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4122] device (eth1): disconnecting for new activation request.
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4123] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4126] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4128] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4129] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4131] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4136] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4140] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (163356c2-3ce3-40ef-96a8-6fd68efd94fc)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4140] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4143] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4145] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4146] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4149] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4153] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4159] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (8b354faf-15a0-48c1-9d17-3244c623e9b0)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4160] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4163] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4164] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4166] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4168] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4172] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4177] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (6db4324d-78f6-49c1-8066-670ae91c29e1)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4178] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4180] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4182] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4183] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4185] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4202] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode,802-3-ethernet.mtu" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4204] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4209] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4210] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4218] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4222] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4226] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4230] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4233] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 kernel: ovs-system: entered promiscuous mode
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4240] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4246] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4251] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4255] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 systemd-udevd[54598]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 06:48:13 compute-0 kernel: Timeout policy base is empty
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4262] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4268] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4273] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4276] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4283] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4289] dhcp4 (eth0): canceled DHCP transaction
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4289] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4289] dhcp4 (eth0): state changed no lease
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4291] dhcp4 (eth0): activation: beginning transaction (no timeout)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4308] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4312] audit: op="device-reapply" interface="eth1" ifindex=3 pid=54594 uid=0 result="fail" reason="Device is not activated"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4375] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4380] dhcp4 (eth0): state changed new lease, address=38.102.83.22
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4443] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Sep 30 06:48:13 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4454] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4458] device (eth1): disconnecting for new activation request.
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4458] audit: op="connection-activate" uuid="1fa3647f-a0b3-57b1-8a07-78f0592e2b89" name="ci-private-network" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4472] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4583] device (eth1): Activation: starting connection 'ci-private-network' (1fa3647f-a0b3-57b1-8a07-78f0592e2b89)
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4588] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4600] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4605] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4611] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4615] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4618] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4620] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4621] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4622] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54594 uid=0 result="success"
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4623] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4624] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4628] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4633] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4636] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4639] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4643] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4647] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4650] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4654] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4657] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4660] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4664] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Sep 30 06:48:13 compute-0 kernel: br-ex: entered promiscuous mode
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4753] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4755] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 kernel: vlan22: entered promiscuous mode
Sep 30 06:48:13 compute-0 systemd-udevd[54600]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4775] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4777] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4786] device (eth1): Activation: successful, device activated.
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4802] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4815] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4837] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4839] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 kernel: vlan20: entered promiscuous mode
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4850] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 06:48:13 compute-0 kernel: vlan21: entered promiscuous mode
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4920] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Sep 30 06:48:13 compute-0 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4947] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4963] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4972] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.4996] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.5000] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.5007] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.5011] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.5014] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.5019] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.5038] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.5058] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.5102] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.5104] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Sep 30 06:48:13 compute-0 NetworkManager[51813]: <info>  [1759214893.5110] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Sep 30 06:48:14 compute-0 NetworkManager[51813]: <info>  [1759214894.6267] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54594 uid=0 result="success"
Sep 30 06:48:14 compute-0 sudo[54924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjsdkicndsgzbfkscigveujpccfwcznl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214894.2464635-570-183879275358470/AnsiballZ_async_status.py'
Sep 30 06:48:14 compute-0 sudo[54924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:14 compute-0 NetworkManager[51813]: <info>  [1759214894.8429] checkpoint[0x5607708bf950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Sep 30 06:48:14 compute-0 NetworkManager[51813]: <info>  [1759214894.8431] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54594 uid=0 result="success"
Sep 30 06:48:15 compute-0 python3.9[54926]: ansible-ansible.legacy.async_status Invoked with jid=j632944549114.54588 mode=status _async_dir=/root/.ansible_async
Sep 30 06:48:15 compute-0 sudo[54924]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:15 compute-0 NetworkManager[51813]: <info>  [1759214895.2164] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54594 uid=0 result="success"
Sep 30 06:48:15 compute-0 NetworkManager[51813]: <info>  [1759214895.2182] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54594 uid=0 result="success"
Sep 30 06:48:15 compute-0 NetworkManager[51813]: <info>  [1759214895.4831] audit: op="networking-control" arg="global-dns-configuration" pid=54594 uid=0 result="success"
Sep 30 06:48:15 compute-0 NetworkManager[51813]: <info>  [1759214895.4877] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Sep 30 06:48:15 compute-0 NetworkManager[51813]: <info>  [1759214895.4973] audit: op="networking-control" arg="global-dns-configuration" pid=54594 uid=0 result="success"
Sep 30 06:48:15 compute-0 NetworkManager[51813]: <info>  [1759214895.5003] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54594 uid=0 result="success"
Sep 30 06:48:15 compute-0 NetworkManager[51813]: <info>  [1759214895.7175] checkpoint[0x5607708bfa20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Sep 30 06:48:15 compute-0 NetworkManager[51813]: <info>  [1759214895.7181] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54594 uid=0 result="success"
Sep 30 06:48:15 compute-0 ansible-async_wrapper.py[54592]: Module complete (54592)
Sep 30 06:48:16 compute-0 ansible-async_wrapper.py[54591]: Done in kid B.
Sep 30 06:48:18 compute-0 sudo[55031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qlucxtjbfumszphqyxglzmfoupejaiqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214894.2464635-570-183879275358470/AnsiballZ_async_status.py'
Sep 30 06:48:18 compute-0 sudo[55031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:18 compute-0 python3.9[55033]: ansible-ansible.legacy.async_status Invoked with jid=j632944549114.54588 mode=status _async_dir=/root/.ansible_async
Sep 30 06:48:18 compute-0 sudo[55031]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:19 compute-0 sudo[55130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vshhseclnztyyhjjvytrytqeuzfrfrik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214894.2464635-570-183879275358470/AnsiballZ_async_status.py'
Sep 30 06:48:19 compute-0 sudo[55130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:19 compute-0 python3.9[55132]: ansible-ansible.legacy.async_status Invoked with jid=j632944549114.54588 mode=cleanup _async_dir=/root/.ansible_async
Sep 30 06:48:19 compute-0 sudo[55130]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:19 compute-0 sudo[55282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-llvhnudusyqzeamvqlhnxfuifcewimoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214899.4902-624-41584918520596/AnsiballZ_stat.py'
Sep 30 06:48:19 compute-0 sudo[55282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:20 compute-0 python3.9[55284]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:48:20 compute-0 sudo[55282]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:20 compute-0 sudo[55405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkmittcridccrgltdmlukocccomxmaip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214899.4902-624-41584918520596/AnsiballZ_copy.py'
Sep 30 06:48:20 compute-0 sudo[55405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:20 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 06:48:20 compute-0 python3.9[55407]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759214899.4902-624-41584918520596/.source.returncode _original_basename=.xa0xfcb_ follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:48:20 compute-0 sudo[55405]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:21 compute-0 sudo[55559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwznljmbpgwvvnciafdmzwcxcsmpnnyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214901.1168044-656-145379620045233/AnsiballZ_stat.py'
Sep 30 06:48:21 compute-0 sudo[55559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:21 compute-0 python3.9[55561]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:48:21 compute-0 sudo[55559]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:22 compute-0 sudo[55683]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytjciumxqccxasubaurwfxwnmajsxcav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214901.1168044-656-145379620045233/AnsiballZ_copy.py'
Sep 30 06:48:22 compute-0 sudo[55683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:22 compute-0 python3.9[55685]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759214901.1168044-656-145379620045233/.source.cfg _original_basename=.x0lcmqik follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:48:22 compute-0 sudo[55683]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:23 compute-0 sudo[55835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lndxtikoklpiiwzwocfdysbqlhisrtlr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214902.663505-686-204628372141391/AnsiballZ_systemd.py'
Sep 30 06:48:23 compute-0 sudo[55835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:23 compute-0 python3.9[55837]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 06:48:23 compute-0 systemd[1]: Reloading Network Manager...
Sep 30 06:48:23 compute-0 NetworkManager[51813]: <info>  [1759214903.4850] audit: op="reload" arg="0" pid=55841 uid=0 result="success"
Sep 30 06:48:23 compute-0 NetworkManager[51813]: <info>  [1759214903.4860] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Sep 30 06:48:23 compute-0 systemd[1]: Reloaded Network Manager.
Sep 30 06:48:23 compute-0 sudo[55835]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:23 compute-0 sshd-session[47817]: Connection closed by 192.168.122.30 port 41042
Sep 30 06:48:23 compute-0 sshd-session[47814]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:48:23 compute-0 systemd[1]: session-13.scope: Deactivated successfully.
Sep 30 06:48:23 compute-0 systemd[1]: session-13.scope: Consumed 53.550s CPU time.
Sep 30 06:48:23 compute-0 systemd-logind[824]: Session 13 logged out. Waiting for processes to exit.
Sep 30 06:48:23 compute-0 systemd-logind[824]: Removed session 13.
Sep 30 06:48:29 compute-0 sshd-session[55872]: Accepted publickey for zuul from 192.168.122.30 port 42696 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 06:48:29 compute-0 systemd-logind[824]: New session 14 of user zuul.
Sep 30 06:48:29 compute-0 systemd[1]: Started Session 14 of User zuul.
Sep 30 06:48:29 compute-0 sshd-session[55872]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:48:30 compute-0 python3.9[56025]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:48:31 compute-0 python3.9[56180]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 06:48:33 compute-0 python3.9[56369]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:48:33 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 06:48:33 compute-0 sshd-session[55875]: Connection closed by 192.168.122.30 port 42696
Sep 30 06:48:33 compute-0 sshd-session[55872]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:48:33 compute-0 systemd[1]: session-14.scope: Deactivated successfully.
Sep 30 06:48:33 compute-0 systemd[1]: session-14.scope: Consumed 2.667s CPU time.
Sep 30 06:48:33 compute-0 systemd-logind[824]: Session 14 logged out. Waiting for processes to exit.
Sep 30 06:48:33 compute-0 systemd-logind[824]: Removed session 14.
Sep 30 06:48:39 compute-0 sshd-session[56398]: Accepted publickey for zuul from 192.168.122.30 port 56562 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 06:48:39 compute-0 systemd-logind[824]: New session 15 of user zuul.
Sep 30 06:48:39 compute-0 systemd[1]: Started Session 15 of User zuul.
Sep 30 06:48:39 compute-0 sshd-session[56398]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:48:40 compute-0 python3.9[56551]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:48:41 compute-0 python3.9[56706]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:48:42 compute-0 sudo[56860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbyebyvnhzclowhhoqywkcfxhxkcgzsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214922.0480425-60-269475030994165/AnsiballZ_setup.py'
Sep 30 06:48:42 compute-0 sudo[56860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:42 compute-0 python3.9[56862]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 06:48:43 compute-0 sudo[56860]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:43 compute-0 sudo[56944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uxklbydmymdlahseuabqswrfppqevwhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214922.0480425-60-269475030994165/AnsiballZ_dnf.py'
Sep 30 06:48:43 compute-0 sudo[56944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:43 compute-0 python3.9[56946]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 06:48:44 compute-0 sudo[56944]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:45 compute-0 sudo[57098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfmxdhfqdjjrhyqgwszkamgxyfcyioap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214925.1782424-84-171064829913015/AnsiballZ_setup.py'
Sep 30 06:48:45 compute-0 sudo[57098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:45 compute-0 python3.9[57100]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 06:48:46 compute-0 sudo[57098]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:46 compute-0 sudo[57289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmejhamoqlitwvxfkeqpxtnaenxtmiya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214926.4336803-106-272603604016358/AnsiballZ_file.py'
Sep 30 06:48:46 compute-0 sudo[57289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:47 compute-0 python3.9[57291]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:48:47 compute-0 sudo[57289]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:47 compute-0 sudo[57441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npaszjzwcuaswkziudrkutuditfqvgvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214927.3890636-122-235640017875193/AnsiballZ_command.py'
Sep 30 06:48:47 compute-0 sudo[57441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:48 compute-0 python3.9[57443]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:48:48 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:48:48 compute-0 sudo[57441]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:48 compute-0 sudo[57607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mfkottiewsevzpbykjwcfwgysednvgpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214928.4405942-138-41170376746467/AnsiballZ_stat.py'
Sep 30 06:48:48 compute-0 sudo[57607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:49 compute-0 sshd-session[57480]: Received disconnect from 91.224.92.28 port 27552:11:  [preauth]
Sep 30 06:48:49 compute-0 sshd-session[57480]: Disconnected from authenticating user root 91.224.92.28 port 27552 [preauth]
Sep 30 06:48:49 compute-0 python3.9[57609]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:48:49 compute-0 sudo[57607]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:49 compute-0 sudo[57685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiuwgsnemwjpiqnrtopbcmfttcumnkht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214928.4405942-138-41170376746467/AnsiballZ_file.py'
Sep 30 06:48:49 compute-0 sudo[57685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:49 compute-0 python3.9[57687]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:48:49 compute-0 sudo[57685]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:50 compute-0 sudo[57837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rjizhumlwiyapyxgwsjeaqdjfeeukidd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214929.9204428-162-124331378315703/AnsiballZ_stat.py'
Sep 30 06:48:50 compute-0 sudo[57837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:50 compute-0 python3.9[57839]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:48:50 compute-0 sudo[57837]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:50 compute-0 sudo[57915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rynjuznjqcfiudbqlqjqlywmkfkojhbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214929.9204428-162-124331378315703/AnsiballZ_file.py'
Sep 30 06:48:50 compute-0 sudo[57915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:51 compute-0 python3.9[57917]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:48:51 compute-0 sudo[57915]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:51 compute-0 sudo[58067]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spcxppgxvmwakotgqtnufpgkzsgjaqef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214931.4220703-188-88047817107805/AnsiballZ_ini_file.py'
Sep 30 06:48:51 compute-0 sudo[58067]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:52 compute-0 python3.9[58069]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:48:52 compute-0 sudo[58067]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:52 compute-0 sudo[58219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dlaculwojakxlpatqkdxvdqbsqvorlew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214932.3422532-188-159407994853766/AnsiballZ_ini_file.py'
Sep 30 06:48:52 compute-0 sudo[58219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:52 compute-0 python3.9[58221]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:48:52 compute-0 sudo[58219]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:53 compute-0 sudo[58371]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-haebavereyjxjyxsjhrjtfvdajyxukvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214933.055392-188-190375970049986/AnsiballZ_ini_file.py'
Sep 30 06:48:53 compute-0 sudo[58371]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:53 compute-0 python3.9[58373]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:48:53 compute-0 sudo[58371]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:54 compute-0 sudo[58523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jliqfmaadughulvgubkpcgskdtvllytf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214933.8250294-188-121828140620935/AnsiballZ_ini_file.py'
Sep 30 06:48:54 compute-0 sudo[58523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:54 compute-0 python3.9[58525]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:48:54 compute-0 sudo[58523]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:55 compute-0 sudo[58675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bkxkxsevvxoshjjnhywxfipovaiwnsrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214934.7187815-250-95531620356816/AnsiballZ_dnf.py'
Sep 30 06:48:55 compute-0 sudo[58675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:55 compute-0 python3.9[58677]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 06:48:56 compute-0 sudo[58675]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:57 compute-0 sudo[58828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agavnjddxwgdovpqpeolhhqyjsdmzsev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214937.0347807-272-225313588347584/AnsiballZ_setup.py'
Sep 30 06:48:57 compute-0 sudo[58828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:57 compute-0 python3.9[58830]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:48:57 compute-0 sudo[58828]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:58 compute-0 sudo[58984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohklwxfigcdnpnttcoxafuztevmzzkuh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214937.9525354-288-273593077794974/AnsiballZ_stat.py'
Sep 30 06:48:58 compute-0 sudo[58984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:58 compute-0 python3.9[58986]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:48:58 compute-0 sudo[58984]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:59 compute-0 sudo[59136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbsgnhxgesfyufeuknfhljnupxxffkgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214938.7005866-306-29663597941391/AnsiballZ_stat.py'
Sep 30 06:48:59 compute-0 sudo[59136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:48:59 compute-0 sshd-session[58930]: Invalid user minecraft from 152.32.253.152 port 35868
Sep 30 06:48:59 compute-0 python3.9[59138]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:48:59 compute-0 sudo[59136]: pam_unix(sudo:session): session closed for user root
Sep 30 06:48:59 compute-0 sshd-session[58930]: Received disconnect from 152.32.253.152 port 35868:11: Bye Bye [preauth]
Sep 30 06:48:59 compute-0 sshd-session[58930]: Disconnected from invalid user minecraft 152.32.253.152 port 35868 [preauth]
Sep 30 06:49:00 compute-0 sudo[59288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmphzhmvvsrrzhzhcszzuilpxdhjflhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214939.5777144-326-140960814310901/AnsiballZ_service_facts.py'
Sep 30 06:49:00 compute-0 sudo[59288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:00 compute-0 python3.9[59290]: ansible-service_facts Invoked
Sep 30 06:49:00 compute-0 network[59307]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 06:49:00 compute-0 network[59308]: 'network-scripts' will be removed from distribution in near future.
Sep 30 06:49:00 compute-0 network[59309]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 06:49:04 compute-0 sudo[59288]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:06 compute-0 sudo[59594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojtlptdsxzrcgolpjzkzahslpyceltmv ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1759214946.079589-352-16355504806966/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1759214946.079589-352-16355504806966/args'
Sep 30 06:49:06 compute-0 sudo[59594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:06 compute-0 sudo[59594]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:07 compute-0 sudo[59761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzylyhwoumyszdbvequuaopifewjjhdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214946.9932601-374-50319023225108/AnsiballZ_dnf.py'
Sep 30 06:49:07 compute-0 sudo[59761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:07 compute-0 python3.9[59763]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 06:49:08 compute-0 sudo[59761]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:10 compute-0 sudo[59914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnxxesvfbihndaojumjplxewvbuehcji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214949.3049648-400-42469913222847/AnsiballZ_package_facts.py'
Sep 30 06:49:10 compute-0 sudo[59914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:10 compute-0 python3.9[59916]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Sep 30 06:49:10 compute-0 sudo[59914]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:11 compute-0 sudo[60066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anhbqybzuvcxspcqlworhgmbwnpltvcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214951.2843206-420-30831568719806/AnsiballZ_stat.py'
Sep 30 06:49:11 compute-0 sudo[60066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:11 compute-0 python3.9[60068]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:49:11 compute-0 sudo[60066]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:12 compute-0 sudo[60191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-thiyfnegfpmyiwcfasiusosewvjqdyvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214951.2843206-420-30831568719806/AnsiballZ_copy.py'
Sep 30 06:49:12 compute-0 sudo[60191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:12 compute-0 python3.9[60193]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759214951.2843206-420-30831568719806/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:49:12 compute-0 sudo[60191]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:13 compute-0 sudo[60345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldipijsvwipoechyexmdbvrpbxtkaxfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214953.1164134-450-280638935990217/AnsiballZ_stat.py'
Sep 30 06:49:13 compute-0 sudo[60345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:13 compute-0 python3.9[60347]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:49:13 compute-0 sudo[60345]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:14 compute-0 sudo[60470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mftnvefgmfqwnkfihukpqwliukaqagrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214953.1164134-450-280638935990217/AnsiballZ_copy.py'
Sep 30 06:49:14 compute-0 sudo[60470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:14 compute-0 python3.9[60472]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759214953.1164134-450-280638935990217/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:49:14 compute-0 sudo[60470]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:15 compute-0 sudo[60624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtqexcczlzobngyphlewuqjjffiqrylo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214955.09346-492-128995709337851/AnsiballZ_lineinfile.py'
Sep 30 06:49:15 compute-0 sudo[60624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:15 compute-0 python3.9[60626]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:49:15 compute-0 sudo[60624]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:17 compute-0 sudo[60778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezphoockmwfmxzfgzydtlknrlszfeamb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214956.7050471-522-145517168158332/AnsiballZ_setup.py'
Sep 30 06:49:17 compute-0 sudo[60778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:17 compute-0 python3.9[60780]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 06:49:17 compute-0 sudo[60778]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:18 compute-0 sudo[60862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgzvowkweghfujxqvocsuokiyoddstcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214956.7050471-522-145517168158332/AnsiballZ_systemd.py'
Sep 30 06:49:18 compute-0 sudo[60862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:18 compute-0 python3.9[60864]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:49:18 compute-0 sudo[60862]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:19 compute-0 sudo[61016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvousqgypfityeduqccnmkeyukhqompi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214959.1601915-554-10102187138527/AnsiballZ_setup.py'
Sep 30 06:49:19 compute-0 sudo[61016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:19 compute-0 python3.9[61018]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 06:49:20 compute-0 sudo[61016]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:20 compute-0 sudo[61100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsikltwfwxobesffobsobiabtwilsxem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214959.1601915-554-10102187138527/AnsiballZ_systemd.py'
Sep 30 06:49:20 compute-0 sudo[61100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:20 compute-0 python3.9[61102]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 06:49:20 compute-0 chronyd[831]: chronyd exiting
Sep 30 06:49:20 compute-0 systemd[1]: Stopping NTP client/server...
Sep 30 06:49:20 compute-0 systemd[1]: chronyd.service: Deactivated successfully.
Sep 30 06:49:20 compute-0 systemd[1]: Stopped NTP client/server.
Sep 30 06:49:20 compute-0 systemd[1]: Starting NTP client/server...
Sep 30 06:49:20 compute-0 chronyd[61110]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Sep 30 06:49:20 compute-0 chronyd[61110]: Frequency -28.238 +/- 0.620 ppm read from /var/lib/chrony/drift
Sep 30 06:49:20 compute-0 chronyd[61110]: Loaded seccomp filter (level 2)
Sep 30 06:49:20 compute-0 systemd[1]: Started NTP client/server.
Sep 30 06:49:21 compute-0 sudo[61100]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:21 compute-0 sshd-session[56401]: Connection closed by 192.168.122.30 port 56562
Sep 30 06:49:21 compute-0 sshd-session[56398]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:49:21 compute-0 systemd[1]: session-15.scope: Deactivated successfully.
Sep 30 06:49:21 compute-0 systemd[1]: session-15.scope: Consumed 28.518s CPU time.
Sep 30 06:49:21 compute-0 systemd-logind[824]: Session 15 logged out. Waiting for processes to exit.
Sep 30 06:49:21 compute-0 systemd-logind[824]: Removed session 15.
Sep 30 06:49:27 compute-0 sshd-session[61136]: Accepted publickey for zuul from 192.168.122.30 port 52074 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 06:49:27 compute-0 systemd-logind[824]: New session 16 of user zuul.
Sep 30 06:49:27 compute-0 systemd[1]: Started Session 16 of User zuul.
Sep 30 06:49:27 compute-0 sshd-session[61136]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:49:28 compute-0 python3.9[61289]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:49:30 compute-0 sudo[61443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddkvdqrxdtydcvgoexdhklzzzktztmew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214969.3680434-46-9224202257973/AnsiballZ_file.py'
Sep 30 06:49:30 compute-0 sudo[61443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:30 compute-0 python3.9[61445]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:49:30 compute-0 sudo[61443]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:31 compute-0 sudo[61618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmuarwddeupwokikbemwvuzkyjvjrloh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214970.5005863-62-81858477008743/AnsiballZ_stat.py'
Sep 30 06:49:31 compute-0 sudo[61618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:31 compute-0 python3.9[61620]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:49:31 compute-0 sudo[61618]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:31 compute-0 sudo[61696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eutriiulirvzqvisrgwglixoljwhvksn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214970.5005863-62-81858477008743/AnsiballZ_file.py'
Sep 30 06:49:31 compute-0 sudo[61696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:31 compute-0 python3.9[61698]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.1zycz68v recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:49:31 compute-0 sudo[61696]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:32 compute-0 sudo[61848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzyxxsmiskzhdlqsrmwfmgqoncsfbdsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214972.4625623-102-242729783796524/AnsiballZ_stat.py'
Sep 30 06:49:32 compute-0 sudo[61848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:33 compute-0 python3.9[61850]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:49:33 compute-0 sudo[61848]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:33 compute-0 sudo[61971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpfcxntlmdpsnxfxgmllaeccpcdxpocw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214972.4625623-102-242729783796524/AnsiballZ_copy.py'
Sep 30 06:49:33 compute-0 sudo[61971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:34 compute-0 python3.9[61973]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759214972.4625623-102-242729783796524/.source _original_basename=.6vcj2m5b follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:49:34 compute-0 sudo[61971]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:34 compute-0 sudo[62123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osfnnjnycatzqtszittsxeszmlbqdzdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214974.2621553-134-245036250973938/AnsiballZ_file.py'
Sep 30 06:49:34 compute-0 sudo[62123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:34 compute-0 python3.9[62125]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:49:34 compute-0 sudo[62123]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:35 compute-0 sudo[62275]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egmyqexuxcfhoqpkmcllplungmzplday ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214975.104052-150-72404222433490/AnsiballZ_stat.py'
Sep 30 06:49:35 compute-0 sudo[62275]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:35 compute-0 python3.9[62277]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:49:35 compute-0 sudo[62275]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:36 compute-0 sudo[62398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilvjxmkedalqrqukaqmevfiadvkrxkrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214975.104052-150-72404222433490/AnsiballZ_copy.py'
Sep 30 06:49:36 compute-0 sudo[62398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:36 compute-0 python3.9[62400]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759214975.104052-150-72404222433490/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:49:36 compute-0 sudo[62398]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:36 compute-0 sudo[62550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iynellfupxlrksvqjvfwlfadisgmlfxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214976.6021478-150-24368290259663/AnsiballZ_stat.py'
Sep 30 06:49:36 compute-0 sudo[62550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:37 compute-0 python3.9[62552]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:49:37 compute-0 sudo[62550]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:37 compute-0 sudo[62673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbdtrfmmxgicsmypxmbfomjtlnnszcfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214976.6021478-150-24368290259663/AnsiballZ_copy.py'
Sep 30 06:49:37 compute-0 sudo[62673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:38 compute-0 python3.9[62675]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759214976.6021478-150-24368290259663/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:49:38 compute-0 sudo[62673]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:38 compute-0 sudo[62825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kfhjabloimyudrdsanmblpqojssspkxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214978.231332-208-261717231430520/AnsiballZ_file.py'
Sep 30 06:49:38 compute-0 sudo[62825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:38 compute-0 python3.9[62827]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:49:38 compute-0 sudo[62825]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:39 compute-0 sudo[62977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecczocxlrmhyaytjoexfqmbbgddvgzzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214979.0581062-224-246823764327587/AnsiballZ_stat.py'
Sep 30 06:49:39 compute-0 sudo[62977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:39 compute-0 python3.9[62979]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:49:39 compute-0 sudo[62977]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:40 compute-0 sudo[63100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbdgyjelvrwtuzyldqqjyxjqbojvimrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214979.0581062-224-246823764327587/AnsiballZ_copy.py'
Sep 30 06:49:40 compute-0 sudo[63100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:40 compute-0 python3.9[63102]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759214979.0581062-224-246823764327587/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:49:40 compute-0 sudo[63100]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:41 compute-0 sudo[63252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztxsuizryrondsylmqetyudpfuhvpgle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214980.7027204-254-30766985806718/AnsiballZ_stat.py'
Sep 30 06:49:41 compute-0 sudo[63252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:41 compute-0 python3.9[63254]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:49:41 compute-0 sudo[63252]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:41 compute-0 sudo[63375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzsiqddujgrnqgjgfdvwjbqcrehxiepe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214980.7027204-254-30766985806718/AnsiballZ_copy.py'
Sep 30 06:49:41 compute-0 sudo[63375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:41 compute-0 python3.9[63377]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759214980.7027204-254-30766985806718/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:49:42 compute-0 sudo[63375]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:42 compute-0 sudo[63527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fulnxumervsybllsoceuhxmydyhzfrlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214982.2134185-284-70975835676855/AnsiballZ_systemd.py'
Sep 30 06:49:42 compute-0 sudo[63527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:43 compute-0 python3.9[63529]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:49:43 compute-0 systemd[1]: Reloading.
Sep 30 06:49:43 compute-0 systemd-sysv-generator[63559]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:49:43 compute-0 systemd-rc-local-generator[63553]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:49:43 compute-0 systemd[1]: Reloading.
Sep 30 06:49:43 compute-0 systemd-rc-local-generator[63589]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:49:43 compute-0 systemd-sysv-generator[63597]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:49:43 compute-0 systemd[1]: Starting EDPM Container Shutdown...
Sep 30 06:49:43 compute-0 systemd[1]: Finished EDPM Container Shutdown.
Sep 30 06:49:43 compute-0 sudo[63527]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:44 compute-0 sudo[63753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojuiphssuuxuniabjjlyddzfgcpdssxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214984.0967188-300-279376713030461/AnsiballZ_stat.py'
Sep 30 06:49:44 compute-0 sudo[63753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:44 compute-0 python3.9[63755]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:49:44 compute-0 sudo[63753]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:45 compute-0 sudo[63876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pajavlmeqgjvattgqgchelaahpviimhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214984.0967188-300-279376713030461/AnsiballZ_copy.py'
Sep 30 06:49:45 compute-0 sudo[63876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:45 compute-0 python3.9[63878]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759214984.0967188-300-279376713030461/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:49:45 compute-0 sudo[63876]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:45 compute-0 sudo[64028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzhscrwmtqtofblvbctncqxbtlmssdax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214985.4772377-330-154094824991875/AnsiballZ_stat.py'
Sep 30 06:49:45 compute-0 sudo[64028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:46 compute-0 python3.9[64030]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:49:46 compute-0 sudo[64028]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:46 compute-0 sudo[64151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ablefyhjrjaztcviyrgcimgvbrnqvhme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214985.4772377-330-154094824991875/AnsiballZ_copy.py'
Sep 30 06:49:46 compute-0 sudo[64151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:46 compute-0 python3.9[64153]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759214985.4772377-330-154094824991875/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:49:46 compute-0 sudo[64151]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:47 compute-0 sudo[64303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owqokgfvqawrvnuwvrsrwuftwvuxpqbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214987.0830736-360-179224496463375/AnsiballZ_systemd.py'
Sep 30 06:49:47 compute-0 sudo[64303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:47 compute-0 python3.9[64305]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:49:47 compute-0 systemd[1]: Reloading.
Sep 30 06:49:47 compute-0 systemd-rc-local-generator[64335]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:49:47 compute-0 systemd-sysv-generator[64338]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:49:48 compute-0 systemd[1]: Reloading.
Sep 30 06:49:48 compute-0 systemd-rc-local-generator[64374]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:49:48 compute-0 systemd-sysv-generator[64379]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:49:48 compute-0 systemd[1]: Starting Create netns directory...
Sep 30 06:49:48 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 06:49:48 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 06:49:48 compute-0 systemd[1]: Finished Create netns directory.
Sep 30 06:49:48 compute-0 sudo[64303]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:49 compute-0 python3.9[64534]: ansible-ansible.builtin.service_facts Invoked
Sep 30 06:49:49 compute-0 network[64551]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 06:49:49 compute-0 network[64552]: 'network-scripts' will be removed from distribution in near future.
Sep 30 06:49:49 compute-0 network[64553]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 06:49:56 compute-0 sudo[64815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwjhjhxzrydicmxlinoftcmealwgribl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214995.828186-392-65954695591158/AnsiballZ_systemd.py'
Sep 30 06:49:56 compute-0 sudo[64815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:56 compute-0 python3.9[64817]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:49:56 compute-0 systemd[1]: Reloading.
Sep 30 06:49:56 compute-0 systemd-sysv-generator[64846]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:49:56 compute-0 systemd-rc-local-generator[64842]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:49:56 compute-0 systemd[1]: Stopping IPv4 firewall with iptables...
Sep 30 06:49:57 compute-0 iptables.init[64858]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Sep 30 06:49:57 compute-0 iptables.init[64858]: iptables: Flushing firewall rules: [  OK  ]
Sep 30 06:49:57 compute-0 systemd[1]: iptables.service: Deactivated successfully.
Sep 30 06:49:57 compute-0 systemd[1]: Stopped IPv4 firewall with iptables.
Sep 30 06:49:57 compute-0 sudo[64815]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:57 compute-0 sudo[65054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vptcxeehqbbjwizqkdxcjginfndsfmca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214997.3801446-392-269785196940176/AnsiballZ_systemd.py'
Sep 30 06:49:57 compute-0 sudo[65054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:58 compute-0 python3.9[65056]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:49:58 compute-0 sudo[65054]: pam_unix(sudo:session): session closed for user root
Sep 30 06:49:58 compute-0 sudo[65208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdizwwxagokqggxvjpiwykshhgpgbxlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214998.548068-424-159208630604197/AnsiballZ_systemd.py'
Sep 30 06:49:58 compute-0 sudo[65208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:49:59 compute-0 python3.9[65210]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:49:59 compute-0 systemd[1]: Reloading.
Sep 30 06:49:59 compute-0 systemd-rc-local-generator[65239]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:49:59 compute-0 systemd-sysv-generator[65243]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:49:59 compute-0 systemd[1]: Starting Netfilter Tables...
Sep 30 06:49:59 compute-0 systemd[1]: Finished Netfilter Tables.
Sep 30 06:49:59 compute-0 sudo[65208]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:00 compute-0 sudo[65403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evqfcyavmcesckbiukfejobvcwuqnycl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759214999.8616173-440-22597237459017/AnsiballZ_command.py'
Sep 30 06:50:00 compute-0 sudo[65403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:00 compute-0 python3.9[65405]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:50:00 compute-0 sudo[65403]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:00 compute-0 sshd-session[65276]: Invalid user user1 from 152.32.253.152 port 59414
Sep 30 06:50:01 compute-0 sshd-session[65276]: Received disconnect from 152.32.253.152 port 59414:11: Bye Bye [preauth]
Sep 30 06:50:01 compute-0 sshd-session[65276]: Disconnected from invalid user user1 152.32.253.152 port 59414 [preauth]
Sep 30 06:50:01 compute-0 sudo[65556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzatkbrvdhbdepikzwreqwbybhxbmwpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215001.1960075-468-276641158281950/AnsiballZ_stat.py'
Sep 30 06:50:01 compute-0 sudo[65556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:01 compute-0 python3.9[65558]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:50:01 compute-0 sudo[65556]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:02 compute-0 sudo[65681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iemshumfswosmvumelfdowwmwvjwhcvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215001.1960075-468-276641158281950/AnsiballZ_copy.py'
Sep 30 06:50:02 compute-0 sudo[65681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:02 compute-0 python3.9[65683]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215001.1960075-468-276641158281950/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:02 compute-0 sudo[65681]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:03 compute-0 sudo[65834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rtnnjxlwbnxbwkgtclkjrwmqkwqzzibf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215002.7465248-500-140004632407459/AnsiballZ_file.py'
Sep 30 06:50:03 compute-0 sudo[65834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:03 compute-0 python3.9[65836]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:03 compute-0 sudo[65834]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:03 compute-0 sudo[65986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibkznzxvvsabgqpowseqjpkxahhgekhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215003.501433-516-243969048632427/AnsiballZ_stat.py'
Sep 30 06:50:03 compute-0 sudo[65986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:04 compute-0 python3.9[65988]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:50:04 compute-0 sudo[65986]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:04 compute-0 sudo[66109]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grxsmwgdnwdpywgjnweuntnglsyxmfph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215003.501433-516-243969048632427/AnsiballZ_copy.py'
Sep 30 06:50:04 compute-0 sudo[66109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:04 compute-0 python3.9[66111]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215003.501433-516-243969048632427/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:04 compute-0 sudo[66109]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:05 compute-0 sudo[66261]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-peqtxtqcruyjwqllqlnypukjtxnysfry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215005.0936587-552-99722592243765/AnsiballZ_timezone.py'
Sep 30 06:50:05 compute-0 sudo[66261]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:05 compute-0 python3.9[66263]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Sep 30 06:50:05 compute-0 systemd[1]: Starting Time & Date Service...
Sep 30 06:50:06 compute-0 systemd[1]: Started Time & Date Service.
Sep 30 06:50:06 compute-0 sudo[66261]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:06 compute-0 sudo[66417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsszrsanldkmviyzrbnxzohicljgqjaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215006.329363-570-245238749053774/AnsiballZ_file.py'
Sep 30 06:50:06 compute-0 sudo[66417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:06 compute-0 python3.9[66419]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:06 compute-0 sudo[66417]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:07 compute-0 sudo[66569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-spdesbjzxyofhniypwrbbdrodctjdepj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215007.0859873-586-155066250346810/AnsiballZ_stat.py'
Sep 30 06:50:07 compute-0 sudo[66569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:07 compute-0 python3.9[66571]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:50:07 compute-0 sudo[66569]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:08 compute-0 sudo[66692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joqjzgpiofxmplkhikebowcjmhwhibku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215007.0859873-586-155066250346810/AnsiballZ_copy.py'
Sep 30 06:50:08 compute-0 sudo[66692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:08 compute-0 python3.9[66694]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215007.0859873-586-155066250346810/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:08 compute-0 sudo[66692]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:08 compute-0 sudo[66844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qodtvrkgsxvfoyprvehlpwxexcqjfqlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215008.5546184-616-2395389580795/AnsiballZ_stat.py'
Sep 30 06:50:08 compute-0 sudo[66844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:09 compute-0 python3.9[66846]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:50:09 compute-0 sudo[66844]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:09 compute-0 sudo[66967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrvwjxvssezqhxjfdhxcxbcditnjheym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215008.5546184-616-2395389580795/AnsiballZ_copy.py'
Sep 30 06:50:09 compute-0 sudo[66967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:09 compute-0 python3.9[66969]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215008.5546184-616-2395389580795/.source.yaml _original_basename=.v_0282a9 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:09 compute-0 sudo[66967]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:10 compute-0 sudo[67119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvmcmgngfenblezovsahssisglpiivxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215010.06824-646-33167664008831/AnsiballZ_stat.py'
Sep 30 06:50:10 compute-0 sudo[67119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:10 compute-0 python3.9[67121]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:50:10 compute-0 sudo[67119]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:11 compute-0 sudo[67242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijnamkqpnzeodkophayjxhniynaolthi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215010.06824-646-33167664008831/AnsiballZ_copy.py'
Sep 30 06:50:11 compute-0 sudo[67242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:11 compute-0 python3.9[67244]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215010.06824-646-33167664008831/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:11 compute-0 sudo[67242]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:11 compute-0 sudo[67394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksiljltlyqyifbosiwinnseshozcmasv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215011.5712352-676-140636742689522/AnsiballZ_command.py'
Sep 30 06:50:11 compute-0 sudo[67394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:12 compute-0 python3.9[67396]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:50:12 compute-0 sudo[67394]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:12 compute-0 sudo[67547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sufsukfywsatdsxrmwwtbgsrvqxcatsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215012.4963398-692-73228757078850/AnsiballZ_command.py'
Sep 30 06:50:12 compute-0 sudo[67547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:13 compute-0 python3.9[67549]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:50:13 compute-0 sudo[67547]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:13 compute-0 sudo[67700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yffbvvibqvyjbudkxrmtmqysxdyfqxkl ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759215013.3476768-708-49285793932209/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 06:50:13 compute-0 sudo[67700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:14 compute-0 python3[67702]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 06:50:14 compute-0 sudo[67700]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:14 compute-0 sudo[67852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzaxqozdkzgkbrggbyfaoeytcwcczugd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215014.428199-724-172564748019814/AnsiballZ_stat.py'
Sep 30 06:50:14 compute-0 sudo[67852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:15 compute-0 python3.9[67854]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:50:15 compute-0 sudo[67852]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:15 compute-0 sudo[67975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rervhzjeqaeuiljhyhfgsqpdamrbilyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215014.428199-724-172564748019814/AnsiballZ_copy.py'
Sep 30 06:50:15 compute-0 sudo[67975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:15 compute-0 python3.9[67977]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215014.428199-724-172564748019814/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:15 compute-0 sudo[67975]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:16 compute-0 sudo[68127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acfglvwlvsgumvwrqjscyacumxvfqayp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215016.0132744-754-16108185512112/AnsiballZ_stat.py'
Sep 30 06:50:16 compute-0 sudo[68127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:16 compute-0 python3.9[68129]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:50:16 compute-0 sudo[68127]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:17 compute-0 sudo[68250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xclwvzphhltbmomuxdzmohiscdpgwgzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215016.0132744-754-16108185512112/AnsiballZ_copy.py'
Sep 30 06:50:17 compute-0 sudo[68250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:17 compute-0 python3.9[68252]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215016.0132744-754-16108185512112/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:17 compute-0 sudo[68250]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:18 compute-0 sudo[68402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pulpzilwpmodpjevvqghrpgqmskptlek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215017.6066387-784-177456267742861/AnsiballZ_stat.py'
Sep 30 06:50:18 compute-0 sudo[68402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:18 compute-0 python3.9[68404]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:50:18 compute-0 sudo[68402]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:18 compute-0 sudo[68525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqvsgebmrxvfebcectcvyxbxsnjgymzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215017.6066387-784-177456267742861/AnsiballZ_copy.py'
Sep 30 06:50:18 compute-0 sudo[68525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:18 compute-0 python3.9[68527]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215017.6066387-784-177456267742861/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:18 compute-0 sudo[68525]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:19 compute-0 sudo[68677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxqmqyijxrhzhizawaefjlvmsnutnduz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215019.1944249-814-202204111253198/AnsiballZ_stat.py'
Sep 30 06:50:19 compute-0 sudo[68677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:19 compute-0 python3.9[68679]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:50:19 compute-0 sudo[68677]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:20 compute-0 sudo[68800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-memgkccexhyjchuiflgoeehseijjlayl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215019.1944249-814-202204111253198/AnsiballZ_copy.py'
Sep 30 06:50:20 compute-0 sudo[68800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:20 compute-0 python3.9[68802]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215019.1944249-814-202204111253198/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:20 compute-0 sudo[68800]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:21 compute-0 sudo[68952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjdgfndtbcqntsabkbwgmbwbkfqdynet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215020.6747408-844-54790703841140/AnsiballZ_stat.py'
Sep 30 06:50:21 compute-0 sudo[68952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:21 compute-0 python3.9[68954]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:50:21 compute-0 sudo[68952]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:21 compute-0 sudo[69075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwxukviuidjkmfpuhobhglzepvzzgvwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215020.6747408-844-54790703841140/AnsiballZ_copy.py'
Sep 30 06:50:21 compute-0 sudo[69075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:22 compute-0 python3.9[69077]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215020.6747408-844-54790703841140/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:22 compute-0 sudo[69075]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:22 compute-0 sudo[69227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwhlbmuaaivofkkfolmqzvdjhdcxoytn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215022.2829273-874-68848113411143/AnsiballZ_file.py'
Sep 30 06:50:22 compute-0 sudo[69227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:22 compute-0 python3.9[69229]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:22 compute-0 sudo[69227]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:23 compute-0 sudo[69379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfszupuuvguphdmdogexfviuiyrkbvil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215023.032621-890-185220428637976/AnsiballZ_command.py'
Sep 30 06:50:23 compute-0 sudo[69379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:23 compute-0 python3.9[69381]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:50:23 compute-0 sudo[69379]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:24 compute-0 sudo[69538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ewfefrddmmqthcxfkgamltisyqqlbndi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215023.9418151-906-62120320868055/AnsiballZ_blockinfile.py'
Sep 30 06:50:24 compute-0 sudo[69538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:24 compute-0 python3.9[69540]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:24 compute-0 sudo[69538]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:25 compute-0 sudo[69691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccswnjkddlrftziqjrwyfqswvjbdbokw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215025.130228-924-41078687606931/AnsiballZ_file.py'
Sep 30 06:50:25 compute-0 sudo[69691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:25 compute-0 python3.9[69693]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:25 compute-0 sudo[69691]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:26 compute-0 sudo[69843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahjomgbeyjsnvngllbkvabodivegnfey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215025.8493233-924-84689288296427/AnsiballZ_file.py'
Sep 30 06:50:26 compute-0 sudo[69843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:26 compute-0 python3.9[69845]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:26 compute-0 sudo[69843]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:27 compute-0 sudo[69995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbplubbtdyupahfbsidkpeehimrkjjll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215026.6215305-954-120525980618907/AnsiballZ_mount.py'
Sep 30 06:50:27 compute-0 sudo[69995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:27 compute-0 python3.9[69997]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Sep 30 06:50:27 compute-0 sudo[69995]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:27 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 06:50:28 compute-0 sudo[70149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bewedazkiwitdsbfjpteplhgcpnxmypc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215027.6298451-954-131217902306851/AnsiballZ_mount.py'
Sep 30 06:50:28 compute-0 sudo[70149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:28 compute-0 python3.9[70151]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Sep 30 06:50:28 compute-0 sudo[70149]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:28 compute-0 sshd-session[61139]: Connection closed by 192.168.122.30 port 52074
Sep 30 06:50:28 compute-0 sshd-session[61136]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:50:28 compute-0 systemd[1]: session-16.scope: Deactivated successfully.
Sep 30 06:50:28 compute-0 systemd[1]: session-16.scope: Consumed 42.358s CPU time.
Sep 30 06:50:28 compute-0 systemd-logind[824]: Session 16 logged out. Waiting for processes to exit.
Sep 30 06:50:28 compute-0 systemd-logind[824]: Removed session 16.
Sep 30 06:50:33 compute-0 sshd-session[70178]: Accepted publickey for zuul from 192.168.122.30 port 43046 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 06:50:33 compute-0 systemd-logind[824]: New session 17 of user zuul.
Sep 30 06:50:33 compute-0 systemd[1]: Started Session 17 of User zuul.
Sep 30 06:50:33 compute-0 sshd-session[70178]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:50:34 compute-0 sudo[70331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdfxsmkkcdjjxnhanbuvdmedhmgrkjdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215034.0426185-17-3521889928963/AnsiballZ_tempfile.py'
Sep 30 06:50:34 compute-0 sudo[70331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:34 compute-0 python3.9[70333]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Sep 30 06:50:34 compute-0 sudo[70331]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:35 compute-0 sudo[70483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nemetmniwqlsfdfttawsfrcwooquafdb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215035.0341012-41-93906332176826/AnsiballZ_stat.py'
Sep 30 06:50:35 compute-0 sudo[70483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:35 compute-0 python3.9[70485]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:50:35 compute-0 sudo[70483]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:36 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Sep 30 06:50:36 compute-0 sudo[70638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcjsfbdkkgqeirneskkyzrmzsfoidajc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215036.0805619-61-92921235829564/AnsiballZ_setup.py'
Sep 30 06:50:36 compute-0 sudo[70638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:36 compute-0 python3.9[70640]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:50:37 compute-0 sudo[70638]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:37 compute-0 sudo[70790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phnqgbwrhpjzgitfeweqddsedlrnuklu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215037.3137228-78-25841045726422/AnsiballZ_blockinfile.py'
Sep 30 06:50:37 compute-0 sudo[70790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:38 compute-0 python3.9[70792]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCVmmgQAGOCFiNwjW5Lh4tC8H3kIKXZP6op45DpGawuDtqIavs5iIr4fKjnqhvE5gPtdtrtzKtiyHG0uN291TUMp5pBwr5wj3VkMqPJBGnXnEc2AZQG2A5D8aZ9tOT7/0tYB07odhIO9IdD+/LciAPmSYLgp40tCZtOLuq6b6fKgvt0uJntrmBOjoh5jkCJRkEkVAXDCC8HpGzZjNvcFpsbHMB1SiovYxqENKQrQdiYmsyFzSL4midDPWHzkon7CIa5Wp4C6xa1n0p2sfzZiwunfwRake6kaiHQoWGAZeazNpXVFk4qIwRIp6AtvqVwl+a1eHozPNW3PSZ4RZSR+n3CBSBfX6NJ25Ez5j2DpNHU3lH4bqTAYrwCJUShCXUiBhNsAyNjWAnAkcYF+Nbs8edIYiDOcIzuOV4usQa5CqrcOpSgOy3q8fMp7nSrB2jLecaf7J/urh+QlbB5fWXUXbAg14p5q+UOCGNfrEUzIrjrFZHEynEWkg/ynDLccQuhtqM=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHvbIRmRdH/9PR/tg5IA85SZcq1TQ6lD0MOvr1moocIw
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBEyMifCpNW7ctUpXIQzI67DJwdqwe9bDNpvK2gs3uDzdgl2dsffa3InAjmgfy0RQW3aXPf5HWfuQwRmPlUxz3mQ=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDFRzE9Fove/Plz6whm4ajGkaj5B984qRyoo8N7vvHqGCkjV5vyYZuMw2h6i1dAsvukFxLnntLbZhjH9BC8rjEISEarnZUJfDAmdA5pzVCzMQsEQv0yaByM+D7j3MLFNZUtzel9637f2LQwJLmqR3SQbCS3zCRTnEVFxsyTMgW1tkkjd7oSXDkPkI+hAiDysvqda3EErqf1Rufq93IljodKg7RkCwqI41gXBNKS2ExovXRZuTXGUAoCHJ3vPCiLWT/+kkSzg606sseAD15NFC5Os5p+4/T4GKsJuNfnLp7zpUGGbrnCVvCW3u97Kc56l0Qa2ZkXaAsWmSdyVPsbV8LrLTVsXzAceExF9JY7h0Y4n8/zM2AE+qxJCdv42zm0FDeGftISojyBbWtCwU8dXYJqaA0+X3Z1FzkuY5xf8xCvl6NRrZ4y6T44sKTQl8QLROoii7ISt/mAG5Ad24XpIKOMR2K3gX9cY6WKySAZ1TSLLmWm6urhDleQqq5bSm7yrIE=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDX2p9L8shL0AQ/R4TH8xaAF688Num0BuxCurKSr1vYr
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBIdwmuVfu8KKiKDFmmizqKYxenWhzTlI47WZsZBt2ZLvlm/zLvnf+8Y6/clwBZfEQYhfsVdYRgH7LO9uOyd9mw=
                                             create=True mode=0644 path=/tmp/ansible.1yqrafvy state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:38 compute-0 sudo[70790]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:38 compute-0 sudo[70942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zcgjyvnepdolgsbgqnjisxtllplrxtkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215038.368676-94-190645384831870/AnsiballZ_command.py'
Sep 30 06:50:38 compute-0 sudo[70942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:39 compute-0 python3.9[70944]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.1yqrafvy' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:50:39 compute-0 sudo[70942]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:39 compute-0 sudo[71096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijtxngdbonflmflnugxetcuolwnulfxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215039.322765-110-129004968973805/AnsiballZ_file.py'
Sep 30 06:50:39 compute-0 sudo[71096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:40 compute-0 python3.9[71098]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.1yqrafvy state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:40 compute-0 sudo[71096]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:40 compute-0 sshd-session[70181]: Connection closed by 192.168.122.30 port 43046
Sep 30 06:50:40 compute-0 sshd-session[70178]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:50:40 compute-0 systemd[1]: session-17.scope: Deactivated successfully.
Sep 30 06:50:40 compute-0 systemd[1]: session-17.scope: Consumed 4.014s CPU time.
Sep 30 06:50:40 compute-0 systemd-logind[824]: Session 17 logged out. Waiting for processes to exit.
Sep 30 06:50:40 compute-0 systemd-logind[824]: Removed session 17.
Sep 30 06:50:45 compute-0 sshd-session[71123]: Accepted publickey for zuul from 192.168.122.30 port 58670 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 06:50:45 compute-0 systemd-logind[824]: New session 18 of user zuul.
Sep 30 06:50:45 compute-0 systemd[1]: Started Session 18 of User zuul.
Sep 30 06:50:45 compute-0 sshd-session[71123]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:50:46 compute-0 python3.9[71276]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:50:48 compute-0 sudo[71430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-trsmbrsuyvsuisalipvipbldjjaygjew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215047.42838-44-249455843329095/AnsiballZ_systemd.py'
Sep 30 06:50:48 compute-0 sudo[71430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:48 compute-0 python3.9[71432]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Sep 30 06:50:48 compute-0 sudo[71430]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:49 compute-0 sudo[71584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnuwzxgejttuqourtrkscgbnbelbeipm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215048.651709-60-137519304204360/AnsiballZ_systemd.py'
Sep 30 06:50:49 compute-0 sudo[71584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:49 compute-0 python3.9[71586]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 06:50:49 compute-0 sudo[71584]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:50 compute-0 sudo[71737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgoarkrvivtrhwnldtpdyyppnarczmdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215049.7381814-80-246981145427728/AnsiballZ_command.py'
Sep 30 06:50:50 compute-0 sudo[71737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:50 compute-0 python3.9[71739]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:50:50 compute-0 sudo[71737]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:51 compute-0 sudo[71890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svnaeqrcpdhfyugsqchdnhcawyfmfivl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215050.7606747-96-163769863212574/AnsiballZ_stat.py'
Sep 30 06:50:51 compute-0 sudo[71890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:51 compute-0 python3.9[71892]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:50:51 compute-0 sudo[71890]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:52 compute-0 sudo[72044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eapcltwpmzjtgddmdmhblmzxojbuybim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215051.692848-112-241253968752542/AnsiballZ_command.py'
Sep 30 06:50:52 compute-0 sudo[72044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:52 compute-0 python3.9[72046]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:50:52 compute-0 sudo[72044]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:53 compute-0 sudo[72199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mebhizjanushobvfdxbaywkyqvrovayz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215052.559825-128-123251773678124/AnsiballZ_file.py'
Sep 30 06:50:53 compute-0 sudo[72199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:50:53 compute-0 python3.9[72201]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:50:53 compute-0 sudo[72199]: pam_unix(sudo:session): session closed for user root
Sep 30 06:50:53 compute-0 sshd-session[71126]: Connection closed by 192.168.122.30 port 58670
Sep 30 06:50:53 compute-0 sshd-session[71123]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:50:53 compute-0 systemd[1]: session-18.scope: Deactivated successfully.
Sep 30 06:50:53 compute-0 systemd[1]: session-18.scope: Consumed 5.500s CPU time.
Sep 30 06:50:53 compute-0 systemd-logind[824]: Session 18 logged out. Waiting for processes to exit.
Sep 30 06:50:53 compute-0 systemd-logind[824]: Removed session 18.
Sep 30 06:50:58 compute-0 sshd-session[72227]: Accepted publickey for zuul from 192.168.122.30 port 45554 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 06:50:58 compute-0 systemd-logind[824]: New session 19 of user zuul.
Sep 30 06:50:58 compute-0 systemd[1]: Started Session 19 of User zuul.
Sep 30 06:50:58 compute-0 sshd-session[72227]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:50:59 compute-0 python3.9[72380]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:51:00 compute-0 sudo[72536]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsxndrltglkzsvsbiwbuthsxtqbwwqjy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215060.5379307-48-273711593115029/AnsiballZ_setup.py'
Sep 30 06:51:00 compute-0 sudo[72536]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:01 compute-0 python3.9[72538]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 06:51:01 compute-0 sudo[72536]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:01 compute-0 sudo[72620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxpyaaxacppselrubkurqvbevvyiiexd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215060.5379307-48-273711593115029/AnsiballZ_dnf.py'
Sep 30 06:51:01 compute-0 sudo[72620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:02 compute-0 sshd-session[72452]: Received disconnect from 152.32.253.152 port 54734:11: Bye Bye [preauth]
Sep 30 06:51:02 compute-0 sshd-session[72452]: Disconnected from authenticating user root 152.32.253.152 port 54734 [preauth]
Sep 30 06:51:02 compute-0 python3.9[72622]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Sep 30 06:51:03 compute-0 sudo[72620]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:04 compute-0 python3.9[72773]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:51:05 compute-0 python3.9[72924]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 06:51:06 compute-0 python3.9[73074]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:51:07 compute-0 python3.9[73224]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:51:07 compute-0 sshd-session[72230]: Connection closed by 192.168.122.30 port 45554
Sep 30 06:51:07 compute-0 sshd-session[72227]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:51:07 compute-0 systemd[1]: session-19.scope: Deactivated successfully.
Sep 30 06:51:07 compute-0 systemd[1]: session-19.scope: Consumed 6.404s CPU time.
Sep 30 06:51:07 compute-0 systemd-logind[824]: Session 19 logged out. Waiting for processes to exit.
Sep 30 06:51:07 compute-0 systemd-logind[824]: Removed session 19.
Sep 30 06:51:13 compute-0 sshd-session[73249]: Accepted publickey for zuul from 192.168.122.30 port 42354 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 06:51:13 compute-0 systemd-logind[824]: New session 20 of user zuul.
Sep 30 06:51:13 compute-0 systemd[1]: Started Session 20 of User zuul.
Sep 30 06:51:13 compute-0 sshd-session[73249]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:51:14 compute-0 python3.9[73402]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:51:16 compute-0 sudo[73556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zhjiztmxwnrnyedvcyzpftzlnvlxubdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215075.653675-83-76080625323390/AnsiballZ_file.py'
Sep 30 06:51:16 compute-0 sudo[73556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:16 compute-0 python3.9[73558]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:51:16 compute-0 sudo[73556]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:16 compute-0 sudo[73708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jddfthixqngdsnwzxnwvbledifuzkgwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215076.4253998-83-259744542896148/AnsiballZ_file.py'
Sep 30 06:51:16 compute-0 sudo[73708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:17 compute-0 python3.9[73710]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:51:17 compute-0 sudo[73708]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:17 compute-0 sudo[73860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucvqfegccpicbammfvteeirniwgskpry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215077.3140683-116-95666118598444/AnsiballZ_stat.py'
Sep 30 06:51:17 compute-0 sudo[73860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:17 compute-0 python3.9[73862]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:17 compute-0 sudo[73860]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:18 compute-0 sudo[73983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijrufobiljskwdjpplyolqjvvubshlgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215077.3140683-116-95666118598444/AnsiballZ_copy.py'
Sep 30 06:51:18 compute-0 sudo[73983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:18 compute-0 python3.9[73985]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215077.3140683-116-95666118598444/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=cd7e869f8c643d640a246a5f83fdd277bff2c13e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:18 compute-0 sudo[73983]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:19 compute-0 sudo[74135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhcenitcjimdflkmgclcifkxlsnwdtjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215078.795678-116-41944068572963/AnsiballZ_stat.py'
Sep 30 06:51:19 compute-0 sudo[74135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:19 compute-0 python3.9[74137]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:19 compute-0 sudo[74135]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:19 compute-0 sudo[74258]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbgzryceupgsujhwprvjqnliqtxqucfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215078.795678-116-41944068572963/AnsiballZ_copy.py'
Sep 30 06:51:19 compute-0 sudo[74258]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:20 compute-0 python3.9[74260]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215078.795678-116-41944068572963/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=bd4585f7450f55a7caf27354254f65fa6296e8c4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:20 compute-0 sudo[74258]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:20 compute-0 sudo[74410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khwlooqimosxwfcyfcvjjbgwszoggogz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215080.2841475-116-271377308095693/AnsiballZ_stat.py'
Sep 30 06:51:20 compute-0 sudo[74410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:20 compute-0 python3.9[74412]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:20 compute-0 sudo[74410]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:21 compute-0 sudo[74533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awxswlpmhkhxasscbjluwpzitwczthgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215080.2841475-116-271377308095693/AnsiballZ_copy.py'
Sep 30 06:51:21 compute-0 sudo[74533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:21 compute-0 python3.9[74535]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215080.2841475-116-271377308095693/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=9f284e19bba1e8cd4e1a5015b462cf554a9b9b9a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:21 compute-0 sudo[74533]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:22 compute-0 sudo[74685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uttqjotgizxsgaahbnpppulizghphuru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215081.830485-206-56757894016972/AnsiballZ_file.py'
Sep 30 06:51:22 compute-0 sudo[74685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:22 compute-0 python3.9[74687]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:51:22 compute-0 sudo[74685]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:23 compute-0 sudo[74837]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnuuxgzjpszuqdkajpahnmwjqjwfjhre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215082.6115239-206-75849060288330/AnsiballZ_file.py'
Sep 30 06:51:23 compute-0 sudo[74837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:23 compute-0 python3.9[74839]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:51:23 compute-0 sudo[74837]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:23 compute-0 sudo[74989]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okyffjvoyyefyooopeuhfnuyumuroorl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215083.509155-242-248612752157897/AnsiballZ_stat.py'
Sep 30 06:51:23 compute-0 sudo[74989]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:24 compute-0 python3.9[74991]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:24 compute-0 sudo[74989]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:24 compute-0 sudo[75112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgqnsrobdxcwncssachpfzgnjuirqtlq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215083.509155-242-248612752157897/AnsiballZ_copy.py'
Sep 30 06:51:24 compute-0 sudo[75112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:24 compute-0 python3.9[75114]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215083.509155-242-248612752157897/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=5025a25a0f1102a07818fc89324ceb273f326298 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:24 compute-0 sudo[75112]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:25 compute-0 sudo[75264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnlttgxuvmngyufbedxwvhymhwwbxjak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215084.9408567-242-35232854584336/AnsiballZ_stat.py'
Sep 30 06:51:25 compute-0 sudo[75264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:25 compute-0 python3.9[75266]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:25 compute-0 sudo[75264]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:26 compute-0 sudo[75387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cjruhzdzitdymhfcgsosdrxnkzitemnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215084.9408567-242-35232854584336/AnsiballZ_copy.py'
Sep 30 06:51:26 compute-0 sudo[75387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:26 compute-0 python3.9[75389]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215084.9408567-242-35232854584336/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=fa31c1df69e6d2533cf3f6266397eb7efa763d76 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:26 compute-0 sudo[75387]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:26 compute-0 sudo[75539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqgruozbaziudeosjwxcmhjthgjueonm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215086.4944446-242-229872190903696/AnsiballZ_stat.py'
Sep 30 06:51:26 compute-0 sudo[75539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:27 compute-0 python3.9[75541]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:27 compute-0 sudo[75539]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:27 compute-0 sudo[75662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swppnvnpowpgvtmkegfqkfdjmlrbdrmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215086.4944446-242-229872190903696/AnsiballZ_copy.py'
Sep 30 06:51:27 compute-0 sudo[75662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:27 compute-0 python3.9[75664]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215086.4944446-242-229872190903696/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=707fd35735873875f30431e0a55017db63992e60 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:27 compute-0 sudo[75662]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:28 compute-0 sudo[75814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zletrajkfvntmeytcnimtxhqwjajdwwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215088.0340412-333-231138959378455/AnsiballZ_file.py'
Sep 30 06:51:28 compute-0 sudo[75814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:28 compute-0 python3.9[75816]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:51:28 compute-0 sudo[75814]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:29 compute-0 sudo[75966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aczrajsyprmnznbxpbftvgnqnjdbioqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215088.7834191-333-260786886748062/AnsiballZ_file.py'
Sep 30 06:51:29 compute-0 sudo[75966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:29 compute-0 python3.9[75968]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:51:29 compute-0 sudo[75966]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:29 compute-0 sudo[76118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgwnaubtekusuygyhxnqxprvswdddlph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215089.62282-367-33239173379818/AnsiballZ_stat.py'
Sep 30 06:51:29 compute-0 sudo[76118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:30 compute-0 python3.9[76120]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:30 compute-0 sudo[76118]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:30 compute-0 chronyd[61110]: Selected source 23.133.168.246 (pool.ntp.org)
Sep 30 06:51:30 compute-0 sudo[76241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hccautdivbgjbhhnldocqabchwmabvdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215089.62282-367-33239173379818/AnsiballZ_copy.py'
Sep 30 06:51:30 compute-0 sudo[76241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:30 compute-0 python3.9[76243]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215089.62282-367-33239173379818/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=2cede445d7a431431e2fa51e4f11a180541babb8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:30 compute-0 sudo[76241]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:31 compute-0 sudo[76393]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aokcindetsvakouvotfvhpiegxhiajrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215091.0352867-367-165130220862969/AnsiballZ_stat.py'
Sep 30 06:51:31 compute-0 sudo[76393]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:31 compute-0 python3.9[76395]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:31 compute-0 sudo[76393]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:32 compute-0 sudo[76516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfzdygypmxidyzccnoztjxvknxitmnxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215091.0352867-367-165130220862969/AnsiballZ_copy.py'
Sep 30 06:51:32 compute-0 sudo[76516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:32 compute-0 python3.9[76518]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215091.0352867-367-165130220862969/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=074bb1bb34bb1340071108f6438466cd5aac603d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:32 compute-0 sudo[76516]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:32 compute-0 sudo[76668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agnglitopphupxocjdkydpkluwovhwit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215092.4070961-367-122401996894236/AnsiballZ_stat.py'
Sep 30 06:51:32 compute-0 sudo[76668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:33 compute-0 python3.9[76670]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:33 compute-0 sudo[76668]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:33 compute-0 sudo[76791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btaaanfpmkrlzioxnrawfunskjijadiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215092.4070961-367-122401996894236/AnsiballZ_copy.py'
Sep 30 06:51:33 compute-0 sudo[76791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:33 compute-0 python3.9[76793]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215092.4070961-367-122401996894236/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=f35348727be5bc1461874b7e562302e334ee2e56 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:33 compute-0 sudo[76791]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:34 compute-0 sudo[76943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxhumgenhrhjpaclamvpvldctstgwtjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215094.0406928-463-61321127213206/AnsiballZ_file.py'
Sep 30 06:51:34 compute-0 sudo[76943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:34 compute-0 python3.9[76945]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:51:34 compute-0 sudo[76943]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:35 compute-0 sudo[77095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhdfcpcfdevzhewvoaeqafbnpusrvlqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215094.8760157-463-17822290732215/AnsiballZ_file.py'
Sep 30 06:51:35 compute-0 sudo[77095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:35 compute-0 python3.9[77097]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:51:35 compute-0 sudo[77095]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:36 compute-0 sudo[77247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obwvgsblzumatftiiskymyvinsqhifpi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215095.6555638-495-227859899776425/AnsiballZ_stat.py'
Sep 30 06:51:36 compute-0 sudo[77247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:36 compute-0 python3.9[77249]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:36 compute-0 sudo[77247]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:36 compute-0 sudo[77370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrspayznsowoudbmhzytowrkjneebxns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215095.6555638-495-227859899776425/AnsiballZ_copy.py'
Sep 30 06:51:36 compute-0 sudo[77370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:36 compute-0 python3.9[77372]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215095.6555638-495-227859899776425/.source.crt _original_basename=compute-0.ctlplane.example.com-tls.crt follow=False checksum=8132130a2c7343dd650a45137aa83ed00522c2da backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:36 compute-0 sudo[77370]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:37 compute-0 sudo[77522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihybfvoesnpfkughbdruczrrzstwwtvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215097.1474788-495-47234625689553/AnsiballZ_stat.py'
Sep 30 06:51:37 compute-0 sudo[77522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:37 compute-0 python3.9[77524]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:37 compute-0 sudo[77522]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:38 compute-0 sudo[77645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxfflythvirdrarzjoagxfhewutwvupp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215097.1474788-495-47234625689553/AnsiballZ_copy.py'
Sep 30 06:51:38 compute-0 sudo[77645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:38 compute-0 python3.9[77647]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215097.1474788-495-47234625689553/.source.crt _original_basename=compute-0.ctlplane.example.com-ca.crt follow=False checksum=074bb1bb34bb1340071108f6438466cd5aac603d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:38 compute-0 sudo[77645]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:38 compute-0 sudo[77797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khvqiztiyzklxnxpwkjkaxhoylaxdzqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215098.6051874-495-220135801474092/AnsiballZ_stat.py'
Sep 30 06:51:38 compute-0 sudo[77797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:39 compute-0 python3.9[77799]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:39 compute-0 sudo[77797]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:39 compute-0 sudo[77920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcqnpbypckempjqcerrlckvnhfceslws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215098.6051874-495-220135801474092/AnsiballZ_copy.py'
Sep 30 06:51:39 compute-0 sudo[77920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:39 compute-0 python3.9[77922]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215098.6051874-495-220135801474092/.source.key _original_basename=compute-0.ctlplane.example.com-tls.key follow=False checksum=6f8f5e91fe89e75de126223eccf55e09438f944c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:39 compute-0 sudo[77920]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:41 compute-0 sudo[78072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqdlzzqclcmabtrtphebcwqnhfzpncst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215100.7874897-623-54224336228345/AnsiballZ_file.py'
Sep 30 06:51:41 compute-0 sudo[78072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:41 compute-0 python3.9[78074]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:51:41 compute-0 sudo[78072]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:42 compute-0 sudo[78224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aasnxtryqqzzkvrzgobdhpprgzzekmjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215101.614278-638-277390890524777/AnsiballZ_stat.py'
Sep 30 06:51:42 compute-0 sudo[78224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:42 compute-0 python3.9[78226]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:42 compute-0 sudo[78224]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:42 compute-0 sudo[78347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtomyovqyohuflnzplzpqznjkxzdkaou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215101.614278-638-277390890524777/AnsiballZ_copy.py'
Sep 30 06:51:42 compute-0 sudo[78347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:42 compute-0 python3.9[78349]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215101.614278-638-277390890524777/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f1630328830f6e47b98e9515af0d5e894e85cff4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:42 compute-0 sudo[78347]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:43 compute-0 sudo[78499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skrgdebnidmpdvvpjcidavdvtzsirhmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215103.1694105-671-169673408039812/AnsiballZ_file.py'
Sep 30 06:51:43 compute-0 sudo[78499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:43 compute-0 python3.9[78501]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:51:43 compute-0 sudo[78499]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:44 compute-0 sudo[78651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzrlomwvsyktzjdspwavuaduutuurjgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215104.1207652-690-120668348344298/AnsiballZ_stat.py'
Sep 30 06:51:44 compute-0 sudo[78651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:44 compute-0 python3.9[78653]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:44 compute-0 sudo[78651]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:45 compute-0 sudo[78774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urvdlnzokxcnlmymogbpesotakdpzuxv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215104.1207652-690-120668348344298/AnsiballZ_copy.py'
Sep 30 06:51:45 compute-0 sudo[78774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:45 compute-0 python3.9[78776]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215104.1207652-690-120668348344298/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f1630328830f6e47b98e9515af0d5e894e85cff4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:45 compute-0 sudo[78774]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:46 compute-0 sudo[78926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekygndllphkdnnunncuaugdbyavqfamk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215105.6833782-724-174219154340724/AnsiballZ_file.py'
Sep 30 06:51:46 compute-0 sudo[78926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:46 compute-0 python3.9[78928]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:51:46 compute-0 sudo[78926]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:46 compute-0 sudo[79078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-entrxgvmmzgxeekoiwifbmkurhzibltr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215106.5075164-741-87237335261374/AnsiballZ_stat.py'
Sep 30 06:51:46 compute-0 sudo[79078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:47 compute-0 python3.9[79080]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:47 compute-0 sudo[79078]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:47 compute-0 sudo[79201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-daertgfcgpwtyrrdsbwhjxiivriuffef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215106.5075164-741-87237335261374/AnsiballZ_copy.py'
Sep 30 06:51:47 compute-0 sudo[79201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:47 compute-0 python3.9[79203]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215106.5075164-741-87237335261374/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f1630328830f6e47b98e9515af0d5e894e85cff4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:47 compute-0 sudo[79201]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:48 compute-0 sudo[79353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjjsmvnotbkphmdnkogcbegdnczswvpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215108.0459595-774-58812012064123/AnsiballZ_file.py'
Sep 30 06:51:48 compute-0 sudo[79353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:48 compute-0 python3.9[79355]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:51:48 compute-0 sudo[79353]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:49 compute-0 sudo[79505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbrsjjzadltwuqqszpcsngxpgtrrjodt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215108.8552022-791-164860011400091/AnsiballZ_stat.py'
Sep 30 06:51:49 compute-0 sudo[79505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:49 compute-0 python3.9[79507]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:49 compute-0 sudo[79505]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:49 compute-0 sudo[79628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xncnuuckxqtcauqjihrlyjvaarjmdyjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215108.8552022-791-164860011400091/AnsiballZ_copy.py'
Sep 30 06:51:49 compute-0 sudo[79628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:50 compute-0 python3.9[79630]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215108.8552022-791-164860011400091/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f1630328830f6e47b98e9515af0d5e894e85cff4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:50 compute-0 sudo[79628]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:50 compute-0 sudo[79780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnjyzfiedlbmkgcwlynbwnsrcttuersv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215110.400389-823-17515155433681/AnsiballZ_file.py'
Sep 30 06:51:50 compute-0 sudo[79780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:51 compute-0 python3.9[79782]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:51:51 compute-0 sudo[79780]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:51 compute-0 sudo[79932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piwzuomcicpgdzfqoxolhcejfcvfjrhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215111.2476614-840-26268131539533/AnsiballZ_stat.py'
Sep 30 06:51:51 compute-0 sudo[79932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:51 compute-0 python3.9[79934]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:51 compute-0 sudo[79932]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:52 compute-0 sudo[80055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzdbnewldvwikgofppajyxongyfkzytz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215111.2476614-840-26268131539533/AnsiballZ_copy.py'
Sep 30 06:51:52 compute-0 sudo[80055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:52 compute-0 python3.9[80057]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215111.2476614-840-26268131539533/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f1630328830f6e47b98e9515af0d5e894e85cff4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:52 compute-0 sudo[80055]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:53 compute-0 sudo[80207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vqwexuoxiitfnpovmczobudridhazugc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215113.0454085-871-143218846234060/AnsiballZ_file.py'
Sep 30 06:51:53 compute-0 sudo[80207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:53 compute-0 python3.9[80209]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:51:53 compute-0 sudo[80207]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:54 compute-0 sudo[80359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbctjjzjcokoqmghtieudfdnvhgrxvku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215113.8775377-879-132524295489043/AnsiballZ_stat.py'
Sep 30 06:51:54 compute-0 sudo[80359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:54 compute-0 python3.9[80361]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:54 compute-0 sudo[80359]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:54 compute-0 sudo[80482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enrkvxfksnpypusniubibzbfobjggdau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215113.8775377-879-132524295489043/AnsiballZ_copy.py'
Sep 30 06:51:54 compute-0 sudo[80482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:55 compute-0 python3.9[80484]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215113.8775377-879-132524295489043/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f1630328830f6e47b98e9515af0d5e894e85cff4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:55 compute-0 sudo[80482]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:55 compute-0 sudo[80634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wcfxilvhihjhzcqmhxhfachrhdouqquk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215115.458157-895-257462040442690/AnsiballZ_file.py'
Sep 30 06:51:55 compute-0 sudo[80634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:56 compute-0 python3.9[80636]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:51:56 compute-0 sudo[80634]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:56 compute-0 sudo[80786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvfhzqndbvcbergnsnzgajzgftqyzkbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215116.2865276-903-91786493077516/AnsiballZ_stat.py'
Sep 30 06:51:56 compute-0 sudo[80786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:56 compute-0 python3.9[80788]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:51:56 compute-0 sudo[80786]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:57 compute-0 sudo[80909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwufpxcxcvsbkdrhjfeiyacgrglcvbrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215116.2865276-903-91786493077516/AnsiballZ_copy.py'
Sep 30 06:51:57 compute-0 sudo[80909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:51:57 compute-0 python3.9[80911]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215116.2865276-903-91786493077516/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=f1630328830f6e47b98e9515af0d5e894e85cff4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:51:57 compute-0 sudo[80909]: pam_unix(sudo:session): session closed for user root
Sep 30 06:51:57 compute-0 sshd-session[73252]: Connection closed by 192.168.122.30 port 42354
Sep 30 06:51:57 compute-0 sshd-session[73249]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:51:57 compute-0 systemd[1]: session-20.scope: Deactivated successfully.
Sep 30 06:51:57 compute-0 systemd[1]: session-20.scope: Consumed 35.845s CPU time.
Sep 30 06:51:57 compute-0 systemd-logind[824]: Session 20 logged out. Waiting for processes to exit.
Sep 30 06:51:57 compute-0 systemd-logind[824]: Removed session 20.
Sep 30 06:52:03 compute-0 sshd-session[80939]: Accepted publickey for zuul from 192.168.122.30 port 44894 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 06:52:03 compute-0 systemd-logind[824]: New session 21 of user zuul.
Sep 30 06:52:03 compute-0 systemd[1]: Started Session 21 of User zuul.
Sep 30 06:52:03 compute-0 sshd-session[80939]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:52:04 compute-0 sshd-session[80937]: Received disconnect from 152.32.253.152 port 50052:11: Bye Bye [preauth]
Sep 30 06:52:04 compute-0 sshd-session[80937]: Disconnected from authenticating user root 152.32.253.152 port 50052 [preauth]
Sep 30 06:52:04 compute-0 python3.9[81092]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:52:05 compute-0 sudo[81246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muafusfkghweallqhffihsfgikdgbnjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215125.0401323-48-51928749892462/AnsiballZ_file.py'
Sep 30 06:52:05 compute-0 sudo[81246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:05 compute-0 python3.9[81248]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:52:05 compute-0 sudo[81246]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:06 compute-0 sudo[81398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-voddsocwakqwqcwmdqatrejfxirwxbwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215126.0229483-48-223459612551374/AnsiballZ_file.py'
Sep 30 06:52:06 compute-0 sudo[81398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:06 compute-0 python3.9[81400]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:52:06 compute-0 sudo[81398]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:07 compute-0 python3.9[81550]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:52:08 compute-0 sudo[81700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtnxfuefbjetyyjynvexoygfhkobnbsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215127.8160553-94-20205809350198/AnsiballZ_seboolean.py'
Sep 30 06:52:08 compute-0 sudo[81700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:08 compute-0 python3.9[81702]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Sep 30 06:52:09 compute-0 sudo[81700]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:10 compute-0 sudo[81856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijfqdyfobaldcfonsllfeqempasyiyjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215129.9988556-114-19151262388597/AnsiballZ_setup.py'
Sep 30 06:52:10 compute-0 dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Sep 30 06:52:10 compute-0 sudo[81856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:10 compute-0 python3.9[81858]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 06:52:10 compute-0 sudo[81856]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:11 compute-0 sudo[81940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjhdvmknxdcdwifovzkqcmbulumfzqbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215129.9988556-114-19151262388597/AnsiballZ_dnf.py'
Sep 30 06:52:11 compute-0 sudo[81940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:11 compute-0 python3.9[81942]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 06:52:12 compute-0 sudo[81940]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:13 compute-0 sudo[82093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzmermpfahpvdlejhvbyudvllruskcaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215133.058594-138-118126398581448/AnsiballZ_systemd.py'
Sep 30 06:52:13 compute-0 sudo[82093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:14 compute-0 python3.9[82095]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 06:52:14 compute-0 sudo[82093]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:14 compute-0 sudo[82248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jgdvojsaapnhgvdkvelcejkawykbmqdn ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759215134.435964-154-274635126314298/AnsiballZ_edpm_nftables_snippet.py'
Sep 30 06:52:14 compute-0 sudo[82248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:15 compute-0 python3[82250]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                            rule:
                                              proto: udp
                                              dport: 4789
                                          - rule_name: 119 neutron geneve networks
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              state: ["UNTRACKED"]
                                          - rule_name: 120 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: OUTPUT
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                          - rule_name: 121 neutron geneve networks no conntrack
                                            rule:
                                              proto: udp
                                              dport: 6081
                                              table: raw
                                              chain: PREROUTING
                                              jump: NOTRACK
                                              action: append
                                              state: []
                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Sep 30 06:52:15 compute-0 sudo[82248]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:15 compute-0 sudo[82400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngbmnjuxhjoedphmnskhxomaipzcuplv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215135.573247-172-95196291953639/AnsiballZ_file.py'
Sep 30 06:52:15 compute-0 sudo[82400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:16 compute-0 python3.9[82402]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:16 compute-0 sudo[82400]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:16 compute-0 sudo[82552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpskfhskivyetqynqwyqzpzphqoibdqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215136.3757393-188-21384296213036/AnsiballZ_stat.py'
Sep 30 06:52:16 compute-0 sudo[82552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:17 compute-0 python3.9[82554]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:52:17 compute-0 sudo[82552]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:17 compute-0 sudo[82630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qszjhiqtyqfxyimmooqoshoxfovrpmsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215136.3757393-188-21384296213036/AnsiballZ_file.py'
Sep 30 06:52:17 compute-0 sudo[82630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:17 compute-0 python3.9[82632]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:17 compute-0 sudo[82630]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:18 compute-0 sudo[82782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sloovdmlrvsodzjwdziomyhlumokkbmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215137.9908705-212-85497150517440/AnsiballZ_stat.py'
Sep 30 06:52:18 compute-0 sudo[82782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:18 compute-0 python3.9[82784]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:52:18 compute-0 sudo[82782]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:18 compute-0 sudo[82860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isuqkvoqbfklpaovkrvyfmcnnmayfmbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215137.9908705-212-85497150517440/AnsiballZ_file.py'
Sep 30 06:52:18 compute-0 sudo[82860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:19 compute-0 python3.9[82862]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.03um6_bf recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:19 compute-0 sudo[82860]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:19 compute-0 sudo[83012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajcuxambjzfizddqmvgfpgpxeqnyvaes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215139.4084308-236-121708534843899/AnsiballZ_stat.py'
Sep 30 06:52:19 compute-0 sudo[83012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:20 compute-0 python3.9[83014]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:52:20 compute-0 sudo[83012]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:20 compute-0 sudo[83090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-habatvbcsitzgfhujeuywqrsuerwlazg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215139.4084308-236-121708534843899/AnsiballZ_file.py'
Sep 30 06:52:20 compute-0 sudo[83090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:20 compute-0 python3.9[83092]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:20 compute-0 sudo[83090]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:21 compute-0 sudo[83242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxygeaigoawvtikixzsoymuznlsrvfor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215140.823381-262-185043086222895/AnsiballZ_command.py'
Sep 30 06:52:21 compute-0 sudo[83242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:21 compute-0 python3.9[83244]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:52:21 compute-0 sudo[83242]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:22 compute-0 sudo[83395]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgggcjybextwhvvzantlvconayripksv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759215141.8360276-278-227393104645792/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 06:52:22 compute-0 sudo[83395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:22 compute-0 python3[83397]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 06:52:22 compute-0 sudo[83395]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:23 compute-0 sudo[83547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjbpchbwnioltgzwucrsjmoadffmcolu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215142.8781376-294-82937180343128/AnsiballZ_stat.py'
Sep 30 06:52:23 compute-0 sudo[83547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:23 compute-0 python3.9[83549]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:52:23 compute-0 sudo[83547]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:24 compute-0 sudo[83672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qixrlwtskgjcuwyhdmuoxqlkkbesaxrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215142.8781376-294-82937180343128/AnsiballZ_copy.py'
Sep 30 06:52:24 compute-0 sudo[83672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:24 compute-0 python3.9[83674]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215142.8781376-294-82937180343128/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:24 compute-0 sudo[83672]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:25 compute-0 sudo[83824]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icpuvcuggvtdrzcjkagkxsnesaetyqmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215144.5794265-324-262584560365665/AnsiballZ_stat.py'
Sep 30 06:52:25 compute-0 sudo[83824]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:25 compute-0 python3.9[83826]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:52:25 compute-0 sudo[83824]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:25 compute-0 sudo[83949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esblctvwwlmnusihrpnnmsfdxowsqjbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215144.5794265-324-262584560365665/AnsiballZ_copy.py'
Sep 30 06:52:25 compute-0 sudo[83949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:25 compute-0 python3.9[83951]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215144.5794265-324-262584560365665/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:25 compute-0 sudo[83949]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:26 compute-0 sudo[84101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcibyacdhybammqwppsfswpnrgjuanpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215146.1973393-354-80408034775718/AnsiballZ_stat.py'
Sep 30 06:52:26 compute-0 sudo[84101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:26 compute-0 python3.9[84103]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:52:26 compute-0 sudo[84101]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:27 compute-0 sudo[84226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgsjarqygdjhtixqncddbuhlxdlvyjmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215146.1973393-354-80408034775718/AnsiballZ_copy.py'
Sep 30 06:52:27 compute-0 sudo[84226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:27 compute-0 python3.9[84228]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215146.1973393-354-80408034775718/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:27 compute-0 sudo[84226]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:28 compute-0 sudo[84378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqnimpmykcwcxrizaktpwcgseurjkpxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215147.78025-384-184531383923443/AnsiballZ_stat.py'
Sep 30 06:52:28 compute-0 sudo[84378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:28 compute-0 python3.9[84380]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:52:28 compute-0 sudo[84378]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:28 compute-0 sudo[84503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hiqbyywdcolrhnqdgukeaevhvhqqetop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215147.78025-384-184531383923443/AnsiballZ_copy.py'
Sep 30 06:52:28 compute-0 sudo[84503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:29 compute-0 python3.9[84505]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215147.78025-384-184531383923443/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:29 compute-0 sudo[84503]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:29 compute-0 sudo[84655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmrzuopkalmaiaebmzclgfwzyvjgjqqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215149.345343-414-142552452080154/AnsiballZ_stat.py'
Sep 30 06:52:29 compute-0 sudo[84655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:30 compute-0 python3.9[84657]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:52:30 compute-0 sudo[84655]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:30 compute-0 sudo[84780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ugnjhdhhgldkmgdhnhkfvcdopedgwzqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215149.345343-414-142552452080154/AnsiballZ_copy.py'
Sep 30 06:52:30 compute-0 sudo[84780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:30 compute-0 python3.9[84782]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215149.345343-414-142552452080154/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:30 compute-0 sudo[84780]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:31 compute-0 sudo[84932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ggwrvmvnyunabigdzppcgimaiikzlnxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215151.0144403-444-21111555615552/AnsiballZ_file.py'
Sep 30 06:52:31 compute-0 sudo[84932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:31 compute-0 python3.9[84934]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:31 compute-0 sudo[84932]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:32 compute-0 sudo[85084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uowzpycnyzxnxuttuycgctmmnegvkrab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215151.7724876-460-133078518664871/AnsiballZ_command.py'
Sep 30 06:52:32 compute-0 sudo[85084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:32 compute-0 python3.9[85086]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:52:32 compute-0 sudo[85084]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:33 compute-0 sudo[85239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rulptjntgnampabxzlznsopxbehtsmyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215152.7496998-476-58035616499732/AnsiballZ_blockinfile.py'
Sep 30 06:52:33 compute-0 sudo[85239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:33 compute-0 python3.9[85241]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:33 compute-0 sudo[85239]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:34 compute-0 sudo[85391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqtnmzkarwpzucqbqrjzauemdgwqstbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215153.7150052-494-254865830361750/AnsiballZ_command.py'
Sep 30 06:52:34 compute-0 sudo[85391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:34 compute-0 python3.9[85393]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:52:34 compute-0 sudo[85391]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:34 compute-0 sudo[85544]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjyaojvpwfvzdibyrihyjgvnxqsagiih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215154.560639-510-4306095267406/AnsiballZ_stat.py'
Sep 30 06:52:34 compute-0 sudo[85544]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:35 compute-0 python3.9[85546]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:52:35 compute-0 sudo[85544]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:35 compute-0 sudo[85698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suiafdetzlpenylhhojfgthkhuhirfua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215155.409575-526-106981694585876/AnsiballZ_command.py'
Sep 30 06:52:35 compute-0 sudo[85698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:36 compute-0 python3.9[85700]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:52:36 compute-0 sudo[85698]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:36 compute-0 sudo[85853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csadhgeyckpkznyzcklrysgimasfqnkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215156.287649-542-75920311044227/AnsiballZ_file.py'
Sep 30 06:52:36 compute-0 sudo[85853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:36 compute-0 python3.9[85855]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:36 compute-0 sudo[85853]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:38 compute-0 python3.9[86005]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:52:39 compute-0 sudo[86156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqvyjbxlhnolechrjrgtbrxeewnxcbyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215158.8340871-622-26774464838860/AnsiballZ_command.py'
Sep 30 06:52:39 compute-0 sudo[86156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:39 compute-0 python3.9[86158]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:74:f6:ca:ec" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:52:39 compute-0 ovs-vsctl[86159]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-0.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:74:f6:ca:ec external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Sep 30 06:52:39 compute-0 sudo[86156]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:40 compute-0 sudo[86309]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooquvbmkaullxodhpzdokhgjaingkpsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215159.7552447-640-24301934370612/AnsiballZ_command.py'
Sep 30 06:52:40 compute-0 sudo[86309]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:40 compute-0 python3.9[86311]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                            ovs-vsctl show | grep -q "Manager"
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:52:40 compute-0 sudo[86309]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:41 compute-0 sudo[86464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbivvrfejxpmprkuvjdjdwxudcsnlthx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215160.6081235-656-63690338853605/AnsiballZ_command.py'
Sep 30 06:52:41 compute-0 sudo[86464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:41 compute-0 python3.9[86466]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:52:41 compute-0 ovs-vsctl[86467]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Sep 30 06:52:41 compute-0 sudo[86464]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:42 compute-0 python3.9[86617]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:52:42 compute-0 sudo[86769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nthmxhkymkcboupbwlxavtkbglizrwiv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215162.355502-690-96691564775925/AnsiballZ_file.py'
Sep 30 06:52:42 compute-0 sudo[86769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:42 compute-0 python3.9[86771]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:52:42 compute-0 sudo[86769]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:43 compute-0 sudo[86921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xydzsjcuochujbseldvjgytwzcsawqem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215163.1662989-706-267907431269587/AnsiballZ_stat.py'
Sep 30 06:52:43 compute-0 sudo[86921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:43 compute-0 python3.9[86923]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:52:43 compute-0 sudo[86921]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:44 compute-0 sudo[86999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbtrpuxcmpqdjhimjzefaxuokyfjrydq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215163.1662989-706-267907431269587/AnsiballZ_file.py'
Sep 30 06:52:44 compute-0 sudo[86999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:44 compute-0 python3.9[87001]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:52:44 compute-0 sudo[86999]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:45 compute-0 sudo[87151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-naqhyznxkyympkcqafzdjahlnxmiiwjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215164.662062-706-52858370530572/AnsiballZ_stat.py'
Sep 30 06:52:45 compute-0 sudo[87151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:45 compute-0 python3.9[87153]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:52:45 compute-0 sudo[87151]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:45 compute-0 sudo[87229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qadsbjnyshtayrdyrerelrmjppozrkmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215164.662062-706-52858370530572/AnsiballZ_file.py'
Sep 30 06:52:45 compute-0 sudo[87229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:45 compute-0 python3.9[87231]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:52:45 compute-0 sudo[87229]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:46 compute-0 sudo[87381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhrnozebzimzguaqqjpmqlbfbbnsonks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215166.021874-752-102161170312682/AnsiballZ_file.py'
Sep 30 06:52:46 compute-0 sudo[87381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:46 compute-0 python3.9[87383]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:46 compute-0 sudo[87381]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:47 compute-0 sudo[87533]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuuejreunnsblqssovttmnojdahvzvjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215166.8002033-768-243788173126600/AnsiballZ_stat.py'
Sep 30 06:52:47 compute-0 sudo[87533]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:47 compute-0 python3.9[87535]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:52:47 compute-0 sudo[87533]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:47 compute-0 sudo[87611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezngebazjeispdpjbtghsqdhmcfcumkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215166.8002033-768-243788173126600/AnsiballZ_file.py'
Sep 30 06:52:47 compute-0 sudo[87611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:48 compute-0 python3.9[87613]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:48 compute-0 sudo[87611]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:48 compute-0 sudo[87763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooggrbwbtgwkuppxtmisjfvbewwwenje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215168.2727268-792-75543716671157/AnsiballZ_stat.py'
Sep 30 06:52:48 compute-0 sudo[87763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:48 compute-0 python3.9[87765]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:52:48 compute-0 sudo[87763]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:49 compute-0 sudo[87841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-runaebxcsqvglpbogfshtoqlmpeujdfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215168.2727268-792-75543716671157/AnsiballZ_file.py'
Sep 30 06:52:49 compute-0 sudo[87841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:49 compute-0 python3.9[87843]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:49 compute-0 sudo[87841]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:50 compute-0 sudo[87993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdqlbzrxbowvpjxgmdfquiqhazrnrdmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215169.8210464-816-31765270697499/AnsiballZ_systemd.py'
Sep 30 06:52:50 compute-0 sudo[87993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:50 compute-0 python3.9[87995]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:52:50 compute-0 systemd[1]: Reloading.
Sep 30 06:52:50 compute-0 systemd-sysv-generator[88027]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:52:50 compute-0 systemd-rc-local-generator[88024]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:52:50 compute-0 sudo[87993]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:51 compute-0 sudo[88182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwyxujzvqxnklzodoancmjuomwiwapzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215171.2314894-832-29559323843818/AnsiballZ_stat.py'
Sep 30 06:52:51 compute-0 sudo[88182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:51 compute-0 python3.9[88184]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:52:51 compute-0 sudo[88182]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:52 compute-0 sudo[88260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ciocqbmjfsghkbqbheajasyobwtehnct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215171.2314894-832-29559323843818/AnsiballZ_file.py'
Sep 30 06:52:52 compute-0 sudo[88260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:52 compute-0 python3.9[88262]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:52 compute-0 sudo[88260]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:53 compute-0 sudo[88412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixqnkmigdbuyhvupvziepoddywqdbojb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215172.7062402-856-105382855966532/AnsiballZ_stat.py'
Sep 30 06:52:53 compute-0 sudo[88412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:53 compute-0 python3.9[88414]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:52:53 compute-0 sudo[88412]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:53 compute-0 sudo[88490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pebhsjxbjzypmbyydolxvvojekrkgmyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215172.7062402-856-105382855966532/AnsiballZ_file.py'
Sep 30 06:52:53 compute-0 sudo[88490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:53 compute-0 python3.9[88492]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:52:53 compute-0 sudo[88490]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:54 compute-0 sudo[88642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nslaocatnocnufhjptmsmtcvsechqnjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215174.0862699-880-15824885592906/AnsiballZ_systemd.py'
Sep 30 06:52:54 compute-0 sudo[88642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:54 compute-0 python3.9[88644]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:52:54 compute-0 systemd[1]: Reloading.
Sep 30 06:52:54 compute-0 systemd-rc-local-generator[88671]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:52:54 compute-0 systemd-sysv-generator[88675]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:52:55 compute-0 systemd[1]: Starting Create netns directory...
Sep 30 06:52:55 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 06:52:55 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 06:52:55 compute-0 systemd[1]: Finished Create netns directory.
Sep 30 06:52:55 compute-0 sudo[88642]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:55 compute-0 sudo[88835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yauceqjobcvgxybcrazdssjzfixhnqgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215175.6068366-900-107346901934203/AnsiballZ_file.py'
Sep 30 06:52:55 compute-0 sudo[88835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:56 compute-0 python3.9[88837]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:52:56 compute-0 sudo[88835]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:56 compute-0 sudo[88987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csmpzjbtklyzijxrlpwmsomqpdfmoogt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215176.4800885-916-23604152961480/AnsiballZ_stat.py'
Sep 30 06:52:56 compute-0 sudo[88987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:57 compute-0 python3.9[88989]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:52:57 compute-0 sudo[88987]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:57 compute-0 sudo[89110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhghnuftzwnvcvuceibtdlhnwdwotlhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215176.4800885-916-23604152961480/AnsiballZ_copy.py'
Sep 30 06:52:57 compute-0 sudo[89110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:57 compute-0 python3.9[89112]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215176.4800885-916-23604152961480/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:52:57 compute-0 sudo[89110]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:58 compute-0 sudo[89262]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hluzntfksufdhwmqqcpbvbldjlamqxmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215178.2637193-950-156812385259028/AnsiballZ_file.py'
Sep 30 06:52:58 compute-0 sudo[89262]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:58 compute-0 python3.9[89264]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:52:58 compute-0 sudo[89262]: pam_unix(sudo:session): session closed for user root
Sep 30 06:52:59 compute-0 sudo[89414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gabropnwbuxqzupnoxlejbozpbrfrjmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215179.1666317-966-185614719109976/AnsiballZ_stat.py'
Sep 30 06:52:59 compute-0 sudo[89414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:52:59 compute-0 python3.9[89416]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:52:59 compute-0 sudo[89414]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:00 compute-0 sudo[89537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwhppsvafkhqtscogaghmabqwkytjkoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215179.1666317-966-185614719109976/AnsiballZ_copy.py'
Sep 30 06:53:00 compute-0 sudo[89537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:00 compute-0 python3.9[89539]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215179.1666317-966-185614719109976/.source.json _original_basename=.nxoe8luw follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:53:00 compute-0 sudo[89537]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:01 compute-0 sudo[89689]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ganrucgsriebaihjbnxqjppbkqfmpmsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215180.736302-996-145727423684159/AnsiballZ_file.py'
Sep 30 06:53:01 compute-0 sudo[89689]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:01 compute-0 python3.9[89691]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:53:01 compute-0 sudo[89689]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:02 compute-0 sudo[89841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmenochtintfasufngflklhsvwxpkjed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215181.7237394-1012-114069430445103/AnsiballZ_stat.py'
Sep 30 06:53:02 compute-0 sudo[89841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:02 compute-0 sudo[89841]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:02 compute-0 sudo[89966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogmptlujhpqztnifqdsotnxkjqjdsbrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215181.7237394-1012-114069430445103/AnsiballZ_copy.py'
Sep 30 06:53:02 compute-0 sudo[89966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:02 compute-0 sshd-session[89844]: Invalid user admin from 194.0.234.19 port 19004
Sep 30 06:53:03 compute-0 sudo[89966]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:03 compute-0 sshd-session[89844]: Connection closed by invalid user admin 194.0.234.19 port 19004 [preauth]
Sep 30 06:53:03 compute-0 PackageKit[31481]: daemon quit
Sep 30 06:53:03 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Sep 30 06:53:03 compute-0 sudo[90118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqgatmmczwoqxlaehxghagjmrewirvau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215183.409974-1046-265731777550385/AnsiballZ_container_config_data.py'
Sep 30 06:53:03 compute-0 sudo[90118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:04 compute-0 python3.9[90120]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Sep 30 06:53:04 compute-0 sudo[90118]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:04 compute-0 sudo[90270]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwzicpxfdazfhvxdzkudrlmuzjwgvikn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215184.4545894-1064-62694910003319/AnsiballZ_container_config_hash.py'
Sep 30 06:53:04 compute-0 sudo[90270]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:05 compute-0 python3.9[90272]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 06:53:05 compute-0 sudo[90270]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:06 compute-0 sudo[90422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdhinljqlukvuctzeznuhysiuianhwcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215185.5379994-1082-21969068093693/AnsiballZ_podman_container_info.py'
Sep 30 06:53:06 compute-0 sudo[90422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:06 compute-0 python3.9[90424]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 06:53:06 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:53:06 compute-0 sudo[90422]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:07 compute-0 sshd-session[90425]: Invalid user maill from 152.32.253.152 port 45368
Sep 30 06:53:07 compute-0 sshd-session[90425]: Received disconnect from 152.32.253.152 port 45368:11: Bye Bye [preauth]
Sep 30 06:53:07 compute-0 sshd-session[90425]: Disconnected from invalid user maill 152.32.253.152 port 45368 [preauth]
Sep 30 06:53:07 compute-0 sudo[90586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-estrudebfdxunrwsemaalgiiqrgrguff ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759215186.9469554-1108-248952617857823/AnsiballZ_edpm_container_manage.py'
Sep 30 06:53:07 compute-0 sudo[90586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:07 compute-0 python3[90588]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 06:53:07 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:53:08 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:53:08 compute-0 podman[90625]: 2025-09-30 06:53:08.172334054 +0000 UTC m=+0.077557961 container create cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Sep 30 06:53:08 compute-0 podman[90625]: 2025-09-30 06:53:08.13288758 +0000 UTC m=+0.038111547 image pull cde2f83a5b58d30e0d6b5c078f3d1ef55892b021d0701b493b9597be9f28e4aa 38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Sep 30 06:53:08 compute-0 python3[90588]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z 38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest
Sep 30 06:53:08 compute-0 sudo[90586]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:08 compute-0 sudo[90812]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zojhggrskncovecqovgqllhickxgdpgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215188.5433445-1124-229599391764526/AnsiballZ_stat.py'
Sep 30 06:53:08 compute-0 sudo[90812]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:08 compute-0 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Sep 30 06:53:09 compute-0 python3.9[90814]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:53:09 compute-0 sudo[90812]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:09 compute-0 sudo[90966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghostckzldxjksvnmpalqxwqndwaprtm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215189.481583-1142-168121844560648/AnsiballZ_file.py'
Sep 30 06:53:09 compute-0 sudo[90966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:10 compute-0 python3.9[90968]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:53:10 compute-0 sudo[90966]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:10 compute-0 sudo[91042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvaucjlyjygnaxpmgqfqxtpihaszjszt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215189.481583-1142-168121844560648/AnsiballZ_stat.py'
Sep 30 06:53:10 compute-0 sudo[91042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:10 compute-0 python3.9[91044]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:53:10 compute-0 sudo[91042]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:11 compute-0 sudo[91193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzksxdprldywckfvbhxnfxuyrqlfazlm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215190.6934357-1142-263215303872981/AnsiballZ_copy.py'
Sep 30 06:53:11 compute-0 sudo[91193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:11 compute-0 python3.9[91195]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759215190.6934357-1142-263215303872981/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:53:11 compute-0 sudo[91193]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:11 compute-0 sudo[91269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrojcvkfqtyxboghtgurcjzdghogteqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215190.6934357-1142-263215303872981/AnsiballZ_systemd.py'
Sep 30 06:53:11 compute-0 sudo[91269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:12 compute-0 python3.9[91271]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 06:53:12 compute-0 systemd[1]: Reloading.
Sep 30 06:53:12 compute-0 systemd-sysv-generator[91300]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:53:12 compute-0 systemd-rc-local-generator[91297]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:53:12 compute-0 sudo[91269]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:12 compute-0 sudo[91379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmwmtwtgselttadymzlpiookdahmwmxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215190.6934357-1142-263215303872981/AnsiballZ_systemd.py'
Sep 30 06:53:12 compute-0 sudo[91379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:13 compute-0 python3.9[91381]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:53:13 compute-0 systemd[1]: Reloading.
Sep 30 06:53:13 compute-0 systemd-sysv-generator[91408]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:53:13 compute-0 systemd-rc-local-generator[91404]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:53:13 compute-0 systemd[1]: Starting ovn_controller container...
Sep 30 06:53:13 compute-0 systemd[1]: Created slice Virtual Machine and Container Slice.
Sep 30 06:53:13 compute-0 systemd[1]: Started libcrun container.
Sep 30 06:53:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/753449065fe3e4ef5859e952a91241384edfcc2d296ac3811235953b3938705c/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Sep 30 06:53:13 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc.
Sep 30 06:53:13 compute-0 podman[91421]: 2025-09-30 06:53:13.694548405 +0000 UTC m=+0.183355869 container init cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 06:53:13 compute-0 ovn_controller[91436]: + sudo -E kolla_set_configs
Sep 30 06:53:13 compute-0 podman[91421]: 2025-09-30 06:53:13.724180212 +0000 UTC m=+0.212987636 container start cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Sep 30 06:53:13 compute-0 edpm-start-podman-container[91421]: ovn_controller
Sep 30 06:53:13 compute-0 systemd[1]: Created slice User Slice of UID 0.
Sep 30 06:53:13 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Sep 30 06:53:13 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Sep 30 06:53:13 compute-0 systemd[1]: Starting User Manager for UID 0...
Sep 30 06:53:13 compute-0 edpm-start-podman-container[91420]: Creating additional drop-in dependency for "ovn_controller" (cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc)
Sep 30 06:53:13 compute-0 systemd[91474]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Sep 30 06:53:13 compute-0 podman[91443]: 2025-09-30 06:53:13.850027855 +0000 UTC m=+0.107442825 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Sep 30 06:53:13 compute-0 systemd[1]: Reloading.
Sep 30 06:53:13 compute-0 systemd-rc-local-generator[91522]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:53:13 compute-0 systemd-sysv-generator[91525]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:53:13 compute-0 systemd[91474]: Queued start job for default target Main User Target.
Sep 30 06:53:14 compute-0 systemd[91474]: Created slice User Application Slice.
Sep 30 06:53:14 compute-0 systemd[91474]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Sep 30 06:53:14 compute-0 systemd[91474]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 06:53:14 compute-0 systemd[91474]: Reached target Paths.
Sep 30 06:53:14 compute-0 systemd[91474]: Reached target Timers.
Sep 30 06:53:14 compute-0 systemd[91474]: Starting D-Bus User Message Bus Socket...
Sep 30 06:53:14 compute-0 systemd[91474]: Starting Create User's Volatile Files and Directories...
Sep 30 06:53:14 compute-0 systemd[91474]: Listening on D-Bus User Message Bus Socket.
Sep 30 06:53:14 compute-0 systemd[91474]: Reached target Sockets.
Sep 30 06:53:14 compute-0 systemd[91474]: Finished Create User's Volatile Files and Directories.
Sep 30 06:53:14 compute-0 systemd[91474]: Reached target Basic System.
Sep 30 06:53:14 compute-0 systemd[91474]: Reached target Main User Target.
Sep 30 06:53:14 compute-0 systemd[91474]: Startup finished in 177ms.
Sep 30 06:53:14 compute-0 systemd[1]: Started User Manager for UID 0.
Sep 30 06:53:14 compute-0 systemd[1]: Started ovn_controller container.
Sep 30 06:53:14 compute-0 systemd[1]: cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc-7d81854ce0dc13e0.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 06:53:14 compute-0 systemd[1]: cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc-7d81854ce0dc13e0.service: Failed with result 'exit-code'.
Sep 30 06:53:14 compute-0 systemd[1]: Started Session c1 of User root.
Sep 30 06:53:14 compute-0 sudo[91379]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:14 compute-0 ovn_controller[91436]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 06:53:14 compute-0 ovn_controller[91436]: INFO:__main__:Validating config file
Sep 30 06:53:14 compute-0 ovn_controller[91436]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 06:53:14 compute-0 ovn_controller[91436]: INFO:__main__:Writing out command to execute
Sep 30 06:53:14 compute-0 systemd[1]: session-c1.scope: Deactivated successfully.
Sep 30 06:53:14 compute-0 ovn_controller[91436]: ++ cat /run_command
Sep 30 06:53:14 compute-0 ovn_controller[91436]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Sep 30 06:53:14 compute-0 ovn_controller[91436]: + ARGS=
Sep 30 06:53:14 compute-0 ovn_controller[91436]: + sudo kolla_copy_cacerts
Sep 30 06:53:14 compute-0 systemd[1]: Started Session c2 of User root.
Sep 30 06:53:14 compute-0 systemd[1]: session-c2.scope: Deactivated successfully.
Sep 30 06:53:14 compute-0 ovn_controller[91436]: + [[ ! -n '' ]]
Sep 30 06:53:14 compute-0 ovn_controller[91436]: + . kolla_extend_start
Sep 30 06:53:14 compute-0 ovn_controller[91436]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Sep 30 06:53:14 compute-0 ovn_controller[91436]: + umask 0022
Sep 30 06:53:14 compute-0 ovn_controller[91436]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Sep 30 06:53:14 compute-0 ovn_controller[91436]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Sep 30 06:53:14 compute-0 ovn_controller[91436]: 2025-09-30T06:53:14Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Sep 30 06:53:14 compute-0 ovn_controller[91436]: 2025-09-30T06:53:14Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Sep 30 06:53:14 compute-0 ovn_controller[91436]: 2025-09-30T06:53:14Z|00003|main|INFO|OVN internal version is : [24.09.4-20.37.0-77.8]
Sep 30 06:53:14 compute-0 ovn_controller[91436]: 2025-09-30T06:53:14Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Sep 30 06:53:14 compute-0 ovn_controller[91436]: 2025-09-30T06:53:14Z|00005|stream_ssl|ERR|ssl:ovsdbserver-sb.openstack.svc:6642: connect: Address family not supported by protocol
Sep 30 06:53:14 compute-0 ovn_controller[91436]: 2025-09-30T06:53:14Z|00006|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Sep 30 06:53:14 compute-0 ovn_controller[91436]: 2025-09-30T06:53:14Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connection attempt failed (Address family not supported by protocol)
Sep 30 06:53:14 compute-0 ovn_controller[91436]: 2025-09-30T06:53:14Z|00008|main|INFO|OVNSB IDL reconnected, force recompute.
Sep 30 06:53:14 compute-0 ovn_controller[91436]: 2025-09-30T06:53:14Z|00009|ovn_util|INFO|statctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Sep 30 06:53:14 compute-0 ovn_controller[91436]: 2025-09-30T06:53:14Z|00010|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 06:53:14 compute-0 ovn_controller[91436]: 2025-09-30T06:53:14Z|00011|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Sep 30 06:53:14 compute-0 ovn_controller[91436]: 2025-09-30T06:53:14Z|00012|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Sep 30 06:53:14 compute-0 ovn_controller[91436]: 2025-09-30T06:53:14Z|00013|ovn_util|INFO|pinctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Sep 30 06:53:14 compute-0 ovn_controller[91436]: 2025-09-30T06:53:14Z|00014|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 06:53:14 compute-0 ovn_controller[91436]: 2025-09-30T06:53:14Z|00015|rconn|WARN|unix:/var/run/openvswitch/br-int.mgmt: connection failed (No such file or directory)
Sep 30 06:53:14 compute-0 ovn_controller[91436]: 2025-09-30T06:53:14Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: waiting 1 seconds before reconnect
Sep 30 06:53:14 compute-0 NetworkManager[51813]: <info>  [1759215194.3778] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Sep 30 06:53:14 compute-0 NetworkManager[51813]: <info>  [1759215194.3788] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Sep 30 06:53:14 compute-0 NetworkManager[51813]: <info>  [1759215194.3805] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Sep 30 06:53:14 compute-0 NetworkManager[51813]: <info>  [1759215194.3814] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Sep 30 06:53:14 compute-0 NetworkManager[51813]: <info>  [1759215194.3819] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Sep 30 06:53:14 compute-0 kernel: br-int: entered promiscuous mode
Sep 30 06:53:14 compute-0 systemd-udevd[91593]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 06:53:14 compute-0 sudo[91698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mpfjwgtjfqcgleropntqmmllrmyrnlyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215194.425185-1198-35716146683277/AnsiballZ_command.py'
Sep 30 06:53:14 compute-0 sudo[91698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:15 compute-0 python3.9[91700]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:53:15 compute-0 ovs-vsctl[91701]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Sep 30 06:53:15 compute-0 sudo[91698]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00001|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00001|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00017|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00018|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00019|ovn_util|INFO|features: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00021|features|INFO|OVS Feature: ct_zero_snat, state: supported
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00022|features|INFO|OVS Feature: ct_flush, state: supported
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00023|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00024|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00025|main|INFO|OVS feature set changed, force recompute.
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00026|ovn_util|INFO|ofctrl: connecting to switch: "unix:/var/run/openvswitch/br-int.mgmt"
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00027|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00028|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00029|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00030|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00031|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00032|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00033|features|INFO|OVS Feature: meter_support, state: supported
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00034|features|INFO|OVS Feature: group_support, state: supported
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00035|main|INFO|OVS feature set changed, force recompute.
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00036|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Sep 30 06:53:15 compute-0 ovn_controller[91436]: 2025-09-30T06:53:15Z|00037|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Sep 30 06:53:15 compute-0 NetworkManager[51813]: <info>  [1759215195.4405] manager: (ovn-f92db7-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Sep 30 06:53:15 compute-0 NetworkManager[51813]: <info>  [1759215195.4419] manager: (ovn-8a9138-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Sep 30 06:53:15 compute-0 kernel: genev_sys_6081: entered promiscuous mode
Sep 30 06:53:15 compute-0 systemd-udevd[91603]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 06:53:15 compute-0 NetworkManager[51813]: <info>  [1759215195.4705] device (genev_sys_6081): carrier: link connected
Sep 30 06:53:15 compute-0 NetworkManager[51813]: <info>  [1759215195.4711] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Sep 30 06:53:15 compute-0 sudo[91854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgxtpnponpxcufvttjbchxsjiyszjivb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215195.2951934-1214-207028933303514/AnsiballZ_command.py'
Sep 30 06:53:15 compute-0 sudo[91854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:15 compute-0 python3.9[91856]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:53:15 compute-0 ovs-vsctl[91858]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Sep 30 06:53:15 compute-0 sudo[91854]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:16 compute-0 sudo[92009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qljcfxnqhwwqehsjbvwpvxjkulfdhspr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215196.503087-1242-188256691401356/AnsiballZ_command.py'
Sep 30 06:53:16 compute-0 sudo[92009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:17 compute-0 python3.9[92011]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:53:17 compute-0 ovs-vsctl[92012]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Sep 30 06:53:17 compute-0 sudo[92009]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:17 compute-0 sshd-session[80942]: Connection closed by 192.168.122.30 port 44894
Sep 30 06:53:17 compute-0 sshd-session[80939]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:53:17 compute-0 systemd-logind[824]: Session 21 logged out. Waiting for processes to exit.
Sep 30 06:53:17 compute-0 systemd[1]: session-21.scope: Deactivated successfully.
Sep 30 06:53:17 compute-0 systemd[1]: session-21.scope: Consumed 56.718s CPU time.
Sep 30 06:53:17 compute-0 systemd-logind[824]: Removed session 21.
Sep 30 06:53:23 compute-0 sshd-session[92037]: Accepted publickey for zuul from 192.168.122.30 port 39996 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 06:53:23 compute-0 systemd-logind[824]: New session 23 of user zuul.
Sep 30 06:53:23 compute-0 systemd[1]: Started Session 23 of User zuul.
Sep 30 06:53:23 compute-0 sshd-session[92037]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:53:24 compute-0 python3.9[92190]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:53:24 compute-0 systemd[1]: Stopping User Manager for UID 0...
Sep 30 06:53:24 compute-0 systemd[91474]: Activating special unit Exit the Session...
Sep 30 06:53:24 compute-0 systemd[91474]: Stopped target Main User Target.
Sep 30 06:53:24 compute-0 systemd[91474]: Stopped target Basic System.
Sep 30 06:53:24 compute-0 systemd[91474]: Stopped target Paths.
Sep 30 06:53:24 compute-0 systemd[91474]: Stopped target Sockets.
Sep 30 06:53:24 compute-0 systemd[91474]: Stopped target Timers.
Sep 30 06:53:24 compute-0 systemd[91474]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 06:53:24 compute-0 systemd[91474]: Closed D-Bus User Message Bus Socket.
Sep 30 06:53:24 compute-0 systemd[91474]: Stopped Create User's Volatile Files and Directories.
Sep 30 06:53:24 compute-0 systemd[91474]: Removed slice User Application Slice.
Sep 30 06:53:24 compute-0 systemd[91474]: Reached target Shutdown.
Sep 30 06:53:24 compute-0 systemd[91474]: Finished Exit the Session.
Sep 30 06:53:24 compute-0 systemd[91474]: Reached target Exit the Session.
Sep 30 06:53:24 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Sep 30 06:53:24 compute-0 systemd[1]: Stopped User Manager for UID 0.
Sep 30 06:53:24 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Sep 30 06:53:24 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Sep 30 06:53:24 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Sep 30 06:53:24 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Sep 30 06:53:24 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Sep 30 06:53:25 compute-0 sudo[92345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udlupquojtfcdpcwyipodegjhoyaycva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215204.8311925-48-88425360976482/AnsiballZ_file.py'
Sep 30 06:53:25 compute-0 sudo[92345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:25 compute-0 python3.9[92347]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:53:25 compute-0 sudo[92345]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:26 compute-0 sudo[92497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqwgfdwfpuzalfbkgsxsqaxuvnkvwzlk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215205.7545812-48-217205967091262/AnsiballZ_file.py'
Sep 30 06:53:26 compute-0 sudo[92497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:26 compute-0 python3.9[92499]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:53:26 compute-0 sudo[92497]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:26 compute-0 sudo[92649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvpdnqvodmnairyrnpxyzjlwbjeaxcgc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215206.5936291-48-160305495317831/AnsiballZ_file.py'
Sep 30 06:53:26 compute-0 sudo[92649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:27 compute-0 python3.9[92651]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:53:27 compute-0 sudo[92649]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:27 compute-0 sudo[92801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-laavuskgjwufmetxzwokuhintywftwhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215207.363994-48-23659582726002/AnsiballZ_file.py'
Sep 30 06:53:27 compute-0 sudo[92801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:28 compute-0 python3.9[92803]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:53:28 compute-0 sudo[92801]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:28 compute-0 sudo[92953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwwgyrynaafyfhqeulntsckohecspwrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215208.2656412-48-45351940779349/AnsiballZ_file.py'
Sep 30 06:53:28 compute-0 sudo[92953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:28 compute-0 python3.9[92955]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:53:28 compute-0 sudo[92953]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:29 compute-0 python3.9[93105]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:53:30 compute-0 sudo[93255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwmshbkzsmmcpnrpwpiggoybpqqgakmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215209.9463098-136-226666747294339/AnsiballZ_seboolean.py'
Sep 30 06:53:30 compute-0 sudo[93255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:30 compute-0 python3.9[93257]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Sep 30 06:53:31 compute-0 sudo[93255]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:32 compute-0 python3.9[93407]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:53:32 compute-0 python3.9[93528]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215211.5157282-152-168670726915196/.source follow=False _original_basename=haproxy.j2 checksum=e770ff414b0fadca51d134a12efcc6c9b048ec99 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:53:33 compute-0 python3.9[93679]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:53:34 compute-0 python3.9[93800]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215213.1318066-182-228307430458580/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:53:35 compute-0 sudo[93950]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifepvthslhlnkjqimajsmrhkgutaafau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215214.8229392-216-212565397847579/AnsiballZ_setup.py'
Sep 30 06:53:35 compute-0 sudo[93950]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:35 compute-0 python3.9[93952]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 06:53:35 compute-0 sudo[93950]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:36 compute-0 sudo[94034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywchwknhaylfbphkrhdvkbbtqcxvbsaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215214.8229392-216-212565397847579/AnsiballZ_dnf.py'
Sep 30 06:53:36 compute-0 sudo[94034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:36 compute-0 python3.9[94036]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 06:53:37 compute-0 sudo[94034]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:38 compute-0 sudo[94187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-calhztblcwkexffkyuvhlasvpddoxvyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215217.9745781-240-241258759276360/AnsiballZ_systemd.py'
Sep 30 06:53:38 compute-0 sudo[94187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:39 compute-0 python3.9[94189]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 06:53:39 compute-0 sudo[94187]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:39 compute-0 python3.9[94342]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:53:40 compute-0 python3.9[94463]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215219.3920114-256-12052589950751/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:53:41 compute-0 python3.9[94613]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:53:42 compute-0 python3.9[94734]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215220.8515363-256-100658540083856/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:53:43 compute-0 python3.9[94884]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:53:44 compute-0 ovn_controller[91436]: 2025-09-30T06:53:44Z|00038|memory|INFO|15816 kB peak resident set size after 30.0 seconds
Sep 30 06:53:44 compute-0 ovn_controller[91436]: 2025-09-30T06:53:44Z|00039|memory|INFO|idl-cells-OVN_Southbound:256 idl-cells-Open_vSwitch:528 ofctrl_desired_flow_usage-KB:6 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Sep 30 06:53:44 compute-0 podman[94979]: 2025-09-30 06:53:44.34764613 +0000 UTC m=+0.149302893 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 06:53:44 compute-0 python3.9[95016]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215223.0793812-344-91132633733819/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:53:45 compute-0 python3.9[95181]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:53:45 compute-0 python3.9[95302]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215224.676189-344-127455533076261/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:53:46 compute-0 python3.9[95452]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:53:47 compute-0 sudo[95604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgjjqpxyyeypxybidhaaptathvjjyudj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215226.994666-420-151337782366653/AnsiballZ_file.py'
Sep 30 06:53:47 compute-0 sudo[95604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:47 compute-0 python3.9[95606]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:53:47 compute-0 sudo[95604]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:48 compute-0 sudo[95756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcpquvymojrmvhecrdhhnsrcykloebtu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215227.9035213-436-211581093606112/AnsiballZ_stat.py'
Sep 30 06:53:48 compute-0 sudo[95756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:48 compute-0 python3.9[95758]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:53:48 compute-0 sudo[95756]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:48 compute-0 sudo[95834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijvpmftjzjniqvndtdypbdlcvtewlvlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215227.9035213-436-211581093606112/AnsiballZ_file.py'
Sep 30 06:53:48 compute-0 sudo[95834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:49 compute-0 python3.9[95836]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:53:49 compute-0 sudo[95834]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:49 compute-0 sudo[95986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isoldvuarjlmlwachkfizbasuvqwlvbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215229.2772973-436-248343280529437/AnsiballZ_stat.py'
Sep 30 06:53:49 compute-0 sudo[95986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:49 compute-0 python3.9[95988]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:53:49 compute-0 sudo[95986]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:50 compute-0 sudo[96064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnbzsfppkluynrwudpgutkrlcdivjshp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215229.2772973-436-248343280529437/AnsiballZ_file.py'
Sep 30 06:53:50 compute-0 sudo[96064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:50 compute-0 python3.9[96066]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:53:50 compute-0 sudo[96064]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:51 compute-0 sudo[96216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knbszceeyxftaezvldasrxhrhuyxtfhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215230.620803-482-239789403046669/AnsiballZ_file.py'
Sep 30 06:53:51 compute-0 sudo[96216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:51 compute-0 python3.9[96218]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:53:51 compute-0 sudo[96216]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:51 compute-0 sudo[96368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilwtmqqokaqalvwgpfnkxdkfvwcffzka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215231.454539-498-139046398620825/AnsiballZ_stat.py'
Sep 30 06:53:51 compute-0 sudo[96368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:52 compute-0 python3.9[96370]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:53:52 compute-0 sudo[96368]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:52 compute-0 sudo[96446]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqruleqanvbyvsbbjxccajgukqdxjaga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215231.454539-498-139046398620825/AnsiballZ_file.py'
Sep 30 06:53:52 compute-0 sudo[96446]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:52 compute-0 python3.9[96448]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:53:52 compute-0 sudo[96446]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:53 compute-0 sudo[96598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkpluovgkfftwjykyzynmmazjxoljvmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215232.9276235-522-140104223784201/AnsiballZ_stat.py'
Sep 30 06:53:53 compute-0 sudo[96598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:53 compute-0 python3.9[96600]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:53:53 compute-0 sudo[96598]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:53 compute-0 sudo[96676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wognrjshywxdzwnhhjsybhwsjgkogjbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215232.9276235-522-140104223784201/AnsiballZ_file.py'
Sep 30 06:53:53 compute-0 sudo[96676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:54 compute-0 python3.9[96678]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:53:54 compute-0 sudo[96676]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:54 compute-0 sudo[96828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhimgnrfmznmphonypuxkvpjzrbmqapl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215234.3601277-546-8254665868282/AnsiballZ_systemd.py'
Sep 30 06:53:54 compute-0 sudo[96828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:55 compute-0 python3.9[96830]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:53:55 compute-0 systemd[1]: Reloading.
Sep 30 06:53:55 compute-0 systemd-rc-local-generator[96860]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:53:55 compute-0 systemd-sysv-generator[96865]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:53:55 compute-0 sudo[96828]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:56 compute-0 sudo[97018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qapkrkripclucklcjhtvtuuxijlbgnly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215235.648658-562-221852458327578/AnsiballZ_stat.py'
Sep 30 06:53:56 compute-0 sudo[97018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:56 compute-0 python3.9[97020]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:53:56 compute-0 sudo[97018]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:56 compute-0 sudo[97096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqdkhmrfvwjqqyuspuiysgozfygobhiw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215235.648658-562-221852458327578/AnsiballZ_file.py'
Sep 30 06:53:56 compute-0 sudo[97096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:56 compute-0 python3.9[97098]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:53:56 compute-0 sudo[97096]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:57 compute-0 sudo[97248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nnoapusdwefdkhxpdblmsibjwhvcdmbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215237.0894861-586-20040322070149/AnsiballZ_stat.py'
Sep 30 06:53:57 compute-0 sudo[97248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:57 compute-0 python3.9[97250]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:53:57 compute-0 sudo[97248]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:58 compute-0 sudo[97326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cpuogkaaevbuxbjrxhruypwvvtfztppd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215237.0894861-586-20040322070149/AnsiballZ_file.py'
Sep 30 06:53:58 compute-0 sudo[97326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:58 compute-0 python3.9[97328]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:53:58 compute-0 sudo[97326]: pam_unix(sudo:session): session closed for user root
Sep 30 06:53:58 compute-0 sudo[97478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wwwggaganwpucjnwgvykrvzhdnbysyyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215238.5327-610-123213479363031/AnsiballZ_systemd.py'
Sep 30 06:53:58 compute-0 sudo[97478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:53:59 compute-0 python3.9[97480]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:53:59 compute-0 systemd[1]: Reloading.
Sep 30 06:53:59 compute-0 systemd-rc-local-generator[97506]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:53:59 compute-0 systemd-sysv-generator[97511]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:53:59 compute-0 systemd[1]: Starting Create netns directory...
Sep 30 06:53:59 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 06:53:59 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 06:53:59 compute-0 systemd[1]: Finished Create netns directory.
Sep 30 06:53:59 compute-0 sudo[97478]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:00 compute-0 sudo[97670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbbensmovpcoapkbcxgarvkqtutmafgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215239.9371982-630-117567045018234/AnsiballZ_file.py'
Sep 30 06:54:00 compute-0 sudo[97670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:00 compute-0 python3.9[97672]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:54:00 compute-0 sudo[97670]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:01 compute-0 sudo[97822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scuohubfxitwtqcvidcoynlmqqqtgvpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215240.8366373-646-275840744484184/AnsiballZ_stat.py'
Sep 30 06:54:01 compute-0 sudo[97822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:01 compute-0 python3.9[97824]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:54:01 compute-0 sudo[97822]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:01 compute-0 sudo[97945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdzufrovaswwiuvvgvkrfpyfnomyazmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215240.8366373-646-275840744484184/AnsiballZ_copy.py'
Sep 30 06:54:01 compute-0 sudo[97945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:02 compute-0 python3.9[97947]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215240.8366373-646-275840744484184/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:54:02 compute-0 sudo[97945]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:03 compute-0 sudo[98097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlqzujldqrieyyzdqlxxnpqcaldkhgvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215242.7034757-680-170932220549516/AnsiballZ_file.py'
Sep 30 06:54:03 compute-0 sudo[98097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:03 compute-0 python3.9[98099]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:54:03 compute-0 sudo[98097]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:03 compute-0 sudo[98249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzhuaonvcjldiyigaokvajgxtuiylzpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215243.5276356-696-83920425039301/AnsiballZ_stat.py'
Sep 30 06:54:03 compute-0 sudo[98249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:04 compute-0 python3.9[98251]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:54:04 compute-0 sudo[98249]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:04 compute-0 sudo[98372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdvzmztjzgwlonzagjncfaumzskltyvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215243.5276356-696-83920425039301/AnsiballZ_copy.py'
Sep 30 06:54:04 compute-0 sudo[98372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:04 compute-0 python3.9[98374]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215243.5276356-696-83920425039301/.source.json _original_basename=.xf9lwby1 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:04 compute-0 sudo[98372]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:05 compute-0 sudo[98524]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zefpeytaqnzzvdehagpenbgvetfeuhyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215245.099573-726-33158218088143/AnsiballZ_file.py'
Sep 30 06:54:05 compute-0 sudo[98524]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:05 compute-0 python3.9[98526]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:05 compute-0 sudo[98524]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:06 compute-0 sudo[98676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wkatgkkiudxgysvavlymutwhblqqffvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215245.9867604-742-37903216114782/AnsiballZ_stat.py'
Sep 30 06:54:06 compute-0 sudo[98676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:06 compute-0 sudo[98676]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:07 compute-0 sudo[98799]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqzrhujaagdqkqcrpzaugcqfnahnuvnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215245.9867604-742-37903216114782/AnsiballZ_copy.py'
Sep 30 06:54:07 compute-0 sudo[98799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:07 compute-0 sudo[98799]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:08 compute-0 sudo[98951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxccvhjietsfucdzcnwimryzvutnzkrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215247.7212892-776-107219441576179/AnsiballZ_container_config_data.py'
Sep 30 06:54:08 compute-0 sudo[98951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:08 compute-0 python3.9[98953]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Sep 30 06:54:08 compute-0 sudo[98951]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:09 compute-0 sudo[99103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdmhimpwuvwwhcwmxeymncdngybuscph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215248.827646-794-44310971445814/AnsiballZ_container_config_hash.py'
Sep 30 06:54:09 compute-0 sudo[99103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:09 compute-0 python3.9[99105]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 06:54:09 compute-0 sudo[99103]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:10 compute-0 sudo[99255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypetogpodyczdfwawbldtzfdqgnlrlmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215249.997849-812-139829189792450/AnsiballZ_podman_container_info.py'
Sep 30 06:54:10 compute-0 sudo[99255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:10 compute-0 python3.9[99257]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 06:54:10 compute-0 sudo[99255]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:12 compute-0 sudo[99435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdmtxdgxrgharmgdxreznhctnxweokvv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759215251.5488462-838-24478085118581/AnsiballZ_edpm_container_manage.py'
Sep 30 06:54:12 compute-0 sudo[99435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:12 compute-0 python3[99437]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 06:54:12 compute-0 podman[99474]: 2025-09-30 06:54:12.718866737 +0000 UTC m=+0.057899087 container create 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Sep 30 06:54:12 compute-0 podman[99474]: 2025-09-30 06:54:12.684019117 +0000 UTC m=+0.023051507 image pull eeebcc09bc72f81ab45f5ab87eb8f6a7b554b949227aeec082bdb0732754ddc8 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 06:54:12 compute-0 python3[99437]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 06:54:12 compute-0 sudo[99435]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:13 compute-0 sudo[99662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcgbmwjlwiwmichxjboogurbeotkaqtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215253.1372058-854-229274525530312/AnsiballZ_stat.py'
Sep 30 06:54:13 compute-0 sudo[99662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:13 compute-0 sshd-session[99422]: Received disconnect from 152.32.253.152 port 40690:11: Bye Bye [preauth]
Sep 30 06:54:13 compute-0 sshd-session[99422]: Disconnected from authenticating user root 152.32.253.152 port 40690 [preauth]
Sep 30 06:54:13 compute-0 python3.9[99664]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:54:13 compute-0 sudo[99662]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:14 compute-0 sudo[99826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtijezunnlvdfnvqmkiglrgevghuhxbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215254.103702-872-105335150406602/AnsiballZ_file.py'
Sep 30 06:54:14 compute-0 sudo[99826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:14 compute-0 podman[99790]: 2025-09-30 06:54:14.595951512 +0000 UTC m=+0.156923733 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 06:54:14 compute-0 python3.9[99836]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:14 compute-0 sudo[99826]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:15 compute-0 sudo[99919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiflozynreqxagloawfoakzoxqdvvxab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215254.103702-872-105335150406602/AnsiballZ_stat.py'
Sep 30 06:54:15 compute-0 sudo[99919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:15 compute-0 python3.9[99921]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:54:15 compute-0 sudo[99919]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:16 compute-0 sudo[100070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzfxczzqugfuhqojpqiayoyjqahnkcpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215255.5789468-872-121679679316365/AnsiballZ_copy.py'
Sep 30 06:54:16 compute-0 sudo[100070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:16 compute-0 python3.9[100072]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759215255.5789468-872-121679679316365/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:16 compute-0 sudo[100070]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:16 compute-0 sudo[100146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxrzyoddrfvptxfdbvhzwarybpezmfgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215255.5789468-872-121679679316365/AnsiballZ_systemd.py'
Sep 30 06:54:16 compute-0 sudo[100146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:16 compute-0 python3.9[100148]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 06:54:16 compute-0 systemd[1]: Reloading.
Sep 30 06:54:16 compute-0 systemd-sysv-generator[100174]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:54:16 compute-0 systemd-rc-local-generator[100171]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:54:17 compute-0 sudo[100146]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:17 compute-0 sudo[100257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lahefcnhbqvhpdlzgqabtjfjpvrrsral ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215255.5789468-872-121679679316365/AnsiballZ_systemd.py'
Sep 30 06:54:17 compute-0 sudo[100257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:17 compute-0 python3.9[100259]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:54:17 compute-0 systemd[1]: Reloading.
Sep 30 06:54:18 compute-0 systemd-rc-local-generator[100289]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:54:18 compute-0 systemd-sysv-generator[100292]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:54:18 compute-0 systemd[1]: Starting ovn_metadata_agent container...
Sep 30 06:54:18 compute-0 systemd[1]: Started libcrun container.
Sep 30 06:54:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e47ee22d09a00e6560f39a615edc5da530106052838630ad4aa952015c63bf71/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Sep 30 06:54:18 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e47ee22d09a00e6560f39a615edc5da530106052838630ad4aa952015c63bf71/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 06:54:18 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917.
Sep 30 06:54:18 compute-0 podman[100300]: 2025-09-30 06:54:18.481337891 +0000 UTC m=+0.237846666 container init 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: + sudo -E kolla_set_configs
Sep 30 06:54:18 compute-0 podman[100300]: 2025-09-30 06:54:18.50978292 +0000 UTC m=+0.266291655 container start 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent)
Sep 30 06:54:18 compute-0 edpm-start-podman-container[100300]: ovn_metadata_agent
Sep 30 06:54:18 compute-0 edpm-start-podman-container[100299]: Creating additional drop-in dependency for "ovn_metadata_agent" (586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917)
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: INFO:__main__:Validating config file
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: INFO:__main__:Copying service configuration files
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: INFO:__main__:Writing out command to execute
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: INFO:__main__:Setting permission for /var/lib/neutron
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: INFO:__main__:Setting permission for /var/lib/neutron/external
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Sep 30 06:54:18 compute-0 systemd[1]: Reloading.
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: ++ cat /run_command
Sep 30 06:54:18 compute-0 podman[100324]: 2025-09-30 06:54:18.614873223 +0000 UTC m=+0.085684281 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: + CMD=neutron-ovn-metadata-agent
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: + ARGS=
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: + sudo kolla_copy_cacerts
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: + [[ ! -n '' ]]
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: + . kolla_extend_start
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: Running command: 'neutron-ovn-metadata-agent'
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: + umask 0022
Sep 30 06:54:18 compute-0 ovn_metadata_agent[100317]: + exec neutron-ovn-metadata-agent
Sep 30 06:54:18 compute-0 systemd-rc-local-generator[100393]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:54:18 compute-0 systemd-sysv-generator[100399]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:54:18 compute-0 systemd[1]: Started ovn_metadata_agent container.
Sep 30 06:54:18 compute-0 sudo[100257]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:19 compute-0 sshd-session[92040]: Connection closed by 192.168.122.30 port 39996
Sep 30 06:54:19 compute-0 sshd-session[92037]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:54:19 compute-0 systemd[1]: session-23.scope: Deactivated successfully.
Sep 30 06:54:19 compute-0 systemd[1]: session-23.scope: Consumed 42.742s CPU time.
Sep 30 06:54:19 compute-0 systemd-logind[824]: Session 23 logged out. Waiting for processes to exit.
Sep 30 06:54:19 compute-0 systemd-logind[824]: Removed session 23.
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.424 100322 INFO neutron.common.config [-] Logging enabled!
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.424 100322 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 26.1.0.dev268
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.424 100322 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.12/site-packages/neutron/common/config.py:124
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.424 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.424 100322 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.425 100322 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.425 100322 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.425 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.425 100322 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.425 100322 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.425 100322 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.425 100322 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.425 100322 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.425 100322 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.425 100322 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.425 100322 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.425 100322 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.425 100322 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.425 100322 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.426 100322 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.426 100322 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.426 100322 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.427 100322 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.427 100322 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.427 100322 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.428 100322 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.428 100322 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.428 100322 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.428 100322 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.428 100322 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.429 100322 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.429 100322 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.429 100322 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.429 100322 DEBUG neutron.agent.ovn.metadata_agent [-] enable_signals                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.429 100322 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.429 100322 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.429 100322 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.430 100322 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.430 100322 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.430 100322 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.430 100322 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.430 100322 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.430 100322 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.431 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.431 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.431 100322 DEBUG neutron.agent.ovn.metadata_agent [-] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.431 100322 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.431 100322 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.431 100322 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.431 100322 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.432 100322 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.432 100322 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.432 100322 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.432 100322 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.432 100322 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.432 100322 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.432 100322 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.433 100322 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.433 100322 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.433 100322 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.433 100322 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.433 100322 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.433 100322 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.433 100322 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.433 100322 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.434 100322 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.434 100322 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.434 100322 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.434 100322 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.434 100322 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.434 100322 DEBUG neutron.agent.ovn.metadata_agent [-] my_ip                          = 38.102.83.22 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.435 100322 DEBUG neutron.agent.ovn.metadata_agent [-] my_ipv6                        = ::1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.435 100322 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.435 100322 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.435 100322 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.435 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.435 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.436 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.436 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.436 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.436 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.436 100322 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.436 100322 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.436 100322 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.437 100322 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.437 100322 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.437 100322 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.437 100322 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.437 100322 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.437 100322 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.437 100322 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.438 100322 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.438 100322 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.438 100322 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.438 100322 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.438 100322 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.438 100322 DEBUG neutron.agent.ovn.metadata_agent [-] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.439 100322 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.439 100322 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.439 100322 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.439 100322 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.439 100322 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.439 100322 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.439 100322 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.440 100322 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.440 100322 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.440 100322 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_qinq                      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.440 100322 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.440 100322 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.440 100322 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.440 100322 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.441 100322 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.441 100322 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.441 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.441 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.441 100322 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.441 100322 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.441 100322 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.442 100322 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.442 100322 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.442 100322 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.442 100322 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.442 100322 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.442 100322 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.443 100322 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_requests        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.443 100322 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.443 100322 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.process_tags   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.443 100322 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_jaeger.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.443 100322 DEBUG neutron.agent.ovn.metadata_agent [-] profiler_otlp.service_name_prefix = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.443 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.443 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.444 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.444 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.444 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.444 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.444 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.444 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.445 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.445 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.445 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.445 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.445 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.445 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.445 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.446 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_timeout     = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.446 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.446 100322 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.446 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.446 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.446 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.446 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.log_daemon_traceback   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.447 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.447 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.447 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.447 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.447 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.447 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.447 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.448 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.448 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.448 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.448 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.448 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.448 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.448 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.449 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.449 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.449 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.449 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.449 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.449 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.449 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.450 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.450 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.450 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.450 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.450 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.450 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.451 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.451 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.451 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.451 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.451 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.451 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.451 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.452 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.452 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.452 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.452 100322 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.452 100322 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.452 100322 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.452 100322 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.453 100322 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.453 100322 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.453 100322 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.453 100322 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.453 100322 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.453 100322 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.453 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.454 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mappings            = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.454 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.datapath_type              = system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.454 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.454 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_reports         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.454 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_flood_unregistered    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.454 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.454 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.int_peer_patch_port        = patch-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.455 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.integration_bridge         = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.455 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.local_ip                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.455 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_connect_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.455 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_inactivity_probe        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.455 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_address          = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.455 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_listen_port             = 6633 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.455 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.of_request_timeout         = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.455 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.openflow_processed_per_port = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.456 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.456 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_debug                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.456 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.456 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.qos_meter_bandwidth        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.456 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_bandwidths = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.456 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_default_hypervisor = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.456 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_hypervisors = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.457 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.457 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_inventory_defaults = {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.457 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_with_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.457 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.resource_provider_packet_processing_without_direction = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.457 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_ca_cert_file           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.457 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_cert_file              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.458 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ssl_key_file               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.458 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tun_peer_patch_port        = patch-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.458 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.tunnel_bridge              = br-tun log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.458 100322 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.vhostuser_socket_dir       = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.458 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.458 100322 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.458 100322 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.459 100322 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.459 100322 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.459 100322 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.459 100322 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.459 100322 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.459 100322 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.459 100322 DEBUG neutron.agent.ovn.metadata_agent [-] agent.extensions               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.460 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.460 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.460 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.460 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.461 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.461 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.461 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.461 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.462 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.462 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.462 100322 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.462 100322 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.463 100322 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.463 100322 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.463 100322 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.463 100322 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.463 100322 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.464 100322 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.464 100322 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.464 100322 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.464 100322 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.464 100322 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.465 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.465 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.465 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.465 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.465 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.466 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.466 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.466 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.466 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.467 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.467 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.467 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.467 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.467 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.468 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.468 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.468 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.468 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.468 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.469 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.469 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.469 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.469 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.469 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.470 100322 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.470 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.broadcast_arps_to_all_routers = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.470 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.470 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.471 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_records_ovn_owned      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.471 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.471 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.471 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.fdb_age_threshold          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.471 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.live_migration_activation_strategy = rarp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.472 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.localnet_learn_fdb         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.472 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.mac_binding_age_threshold  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.472 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.472 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.472 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.473 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.473 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.473 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.473 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.474 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.474 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = ['tcp:127.0.0.1:6641'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.474 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.474 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_router_indirect_snat   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.474 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.475 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.475 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ['ssl:ovsdbserver-sb.openstack.svc:6642'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.475 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.476 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.476 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.476 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.476 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.476 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.477 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.fdb_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.477 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.ignore_lsp_down  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.477 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovn_nb_global.mac_binding_removal_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.477 100322 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.477 100322 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.base_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.478 100322 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_query_rate_limit = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.478 100322 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.burst_window_duration = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.478 100322 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.ip_versions = [4] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.478 100322 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_rate_limiting.rate_limit_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.479 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.479 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.479 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.479 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.479 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.480 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.480 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.480 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.480 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.481 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.481 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.481 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.481 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.481 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.482 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.482 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.482 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.482 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.processname = neutron-ovn-metadata-agent log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.483 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.483 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.483 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.483 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.483 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.484 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.484 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.484 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.484 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.484 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.485 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.485 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.485 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_quorum_queue = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.485 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.485 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.486 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.486 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.486 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.486 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.487 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.487 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.487 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.487 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.487 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.488 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.488 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.488 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.488 100322 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.488 100322 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.532 100322 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.532 100322 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.533 100322 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.533 100322 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.533 100322 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.552 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 01429670-4ea1-4dab-babc-4bc628cc01bb (UUID: 01429670-4ea1-4dab-babc-4bc628cc01bb) and ovn bridge br-int. _load_config /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:419
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.589 100322 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.590 100322 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.590 100322 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Port_Binding.logical_port autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.590 100322 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.590 100322 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.595 100322 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.603 100322 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.613 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '01429670-4ea1-4dab-babc-4bc628cc01bb'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], external_ids={}, name=01429670-4ea1-4dab-babc-4bc628cc01bb, nb_cfg_timestamp=1759215203436, nb_cfg=1) old= matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 06:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:20.617 100322 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp1bkdyc1e/privsep.sock']
Sep 30 06:54:21 compute-0 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Sep 30 06:54:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:21.377 100322 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Sep 30 06:54:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:21.378 100322 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp1bkdyc1e/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Sep 30 06:54:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:21.216 100440 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 06:54:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:21.222 100440 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 06:54:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:21.225 100440 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Sep 30 06:54:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:21.226 100440 INFO oslo.privsep.daemon [-] privsep daemon running as pid 100440
Sep 30 06:54:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:21.381 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[7c869c3f-643a-47dc-81da-0ae6838e2d86]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 06:54:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:21.862 100440 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 06:54:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:21.863 100440 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 06:54:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:21.863 100440 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 06:54:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:22.270 100440 INFO oslo_service.backend [-] Loading backend: eventlet
Sep 30 06:54:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:22.275 100440 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Sep 30 06:54:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:22.308 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[7c37642b-857d-4de0-a68f-4875af51300e]: (4, []) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 06:54:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:22.309 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, column=external_ids, values=({'neutron:ovn-metadata-id': '5dfbf58a-a9d5-5f47-a1a1-c178dab8726f'},)) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 06:54:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:22.316 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 06:54:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:54:22.321 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '1'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 06:54:24 compute-0 sshd-session[100445]: Accepted publickey for zuul from 192.168.122.30 port 46414 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 06:54:24 compute-0 systemd-logind[824]: New session 24 of user zuul.
Sep 30 06:54:24 compute-0 systemd[1]: Started Session 24 of User zuul.
Sep 30 06:54:24 compute-0 sshd-session[100445]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:54:26 compute-0 python3.9[100598]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:54:27 compute-0 sudo[100752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mivkeecwgvipzoyrxjbmyjgbneoocrin ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215266.6735704-48-91819339618735/AnsiballZ_command.py'
Sep 30 06:54:27 compute-0 sudo[100752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:27 compute-0 python3.9[100754]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:54:27 compute-0 sudo[100752]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:28 compute-0 sudo[100918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csqzboatmhjfayvkpojaatejcsndatjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215267.9910932-70-76953452739793/AnsiballZ_systemd_service.py'
Sep 30 06:54:28 compute-0 sudo[100918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:28 compute-0 python3.9[100920]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 06:54:28 compute-0 systemd[1]: Reloading.
Sep 30 06:54:29 compute-0 systemd-rc-local-generator[100946]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:54:29 compute-0 systemd-sysv-generator[100950]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:54:29 compute-0 sudo[100918]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:30 compute-0 python3.9[101105]: ansible-ansible.builtin.service_facts Invoked
Sep 30 06:54:30 compute-0 network[101124]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 06:54:30 compute-0 network[101125]: 'network-scripts' will be removed from distribution in near future.
Sep 30 06:54:30 compute-0 network[101126]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 06:54:31 compute-0 sshd-session[101106]: Received disconnect from 193.46.255.103 port 39902:11:  [preauth]
Sep 30 06:54:31 compute-0 sshd-session[101106]: Disconnected from authenticating user root 193.46.255.103 port 39902 [preauth]
Sep 30 06:54:36 compute-0 sshd-session[101263]: Connection closed by authenticating user root 155.94.170.162 port 55768 [preauth]
Sep 30 06:54:36 compute-0 sshd-session[101265]: Connection closed by authenticating user root 155.94.170.162 port 55774 [preauth]
Sep 30 06:54:38 compute-0 sudo[101392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhyvyowrxhksmklhcqhdmkcorxfjaxhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215278.0955226-108-140787689918325/AnsiballZ_systemd_service.py'
Sep 30 06:54:38 compute-0 sudo[101392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:38 compute-0 python3.9[101394]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:54:38 compute-0 sudo[101392]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:39 compute-0 sudo[101545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opxeajoicfxogugpbpfclguzyorqyyqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215279.042478-108-111437773597472/AnsiballZ_systemd_service.py'
Sep 30 06:54:39 compute-0 sudo[101545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:39 compute-0 python3.9[101547]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:54:39 compute-0 sudo[101545]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:40 compute-0 sudo[101698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcrcsmshcfpbhlvrsvxrzyasgmlvtedr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215279.9589384-108-196251242386849/AnsiballZ_systemd_service.py'
Sep 30 06:54:40 compute-0 sudo[101698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:40 compute-0 python3.9[101700]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:54:40 compute-0 sudo[101698]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:41 compute-0 sudo[101851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rxzajanpcvubpsvqrqbzfgogesogrfef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215280.9326103-108-12378982215242/AnsiballZ_systemd_service.py'
Sep 30 06:54:41 compute-0 sudo[101851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:41 compute-0 python3.9[101853]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:54:42 compute-0 sudo[101851]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:43 compute-0 sudo[102004]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbpnlvxrrflipowezfrwymqpbkdmmrwp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215282.9969783-108-121103935798906/AnsiballZ_systemd_service.py'
Sep 30 06:54:43 compute-0 sudo[102004]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:43 compute-0 python3.9[102006]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:54:43 compute-0 sudo[102004]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:44 compute-0 sudo[102157]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyyqakculjuwkedtqxjewrahvgxebszv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215283.9346454-108-158422530763297/AnsiballZ_systemd_service.py'
Sep 30 06:54:44 compute-0 sudo[102157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:44 compute-0 python3.9[102159]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:54:44 compute-0 sudo[102157]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:44 compute-0 podman[102161]: 2025-09-30 06:54:44.814101963 +0000 UTC m=+0.126745021 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Sep 30 06:54:45 compute-0 sudo[102337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avbujzfmzrdzgtysennybiqdxqzunjbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215284.8780904-108-246936687399094/AnsiballZ_systemd_service.py'
Sep 30 06:54:45 compute-0 sudo[102337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:45 compute-0 python3.9[102339]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:54:46 compute-0 sudo[102337]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:47 compute-0 sudo[102490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubknxyycefighsmsibrccpvrwdcfmhmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215287.2716928-212-242054339217602/AnsiballZ_file.py'
Sep 30 06:54:47 compute-0 sudo[102490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:48 compute-0 python3.9[102492]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:48 compute-0 sudo[102490]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:48 compute-0 sudo[102642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rirzigsadfzcggbtgaqsiltptlszaujv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215288.195516-212-274626855647372/AnsiballZ_file.py'
Sep 30 06:54:48 compute-0 sudo[102642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:48 compute-0 python3.9[102644]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:48 compute-0 sudo[102642]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:49 compute-0 podman[102768]: 2025-09-30 06:54:49.347226154 +0000 UTC m=+0.069035541 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Sep 30 06:54:49 compute-0 sudo[102811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isjlkfnxgtimpgskqxpxjfxzdufecudf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215289.002379-212-273511740013056/AnsiballZ_file.py'
Sep 30 06:54:49 compute-0 sudo[102811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:49 compute-0 python3.9[102815]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:49 compute-0 sudo[102811]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:50 compute-0 sudo[102965]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jvcncjoynfxdwqewguxtbsgynwjumvup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215289.7600284-212-56938082478425/AnsiballZ_file.py'
Sep 30 06:54:50 compute-0 sudo[102965]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:50 compute-0 python3.9[102967]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:50 compute-0 sudo[102965]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:50 compute-0 sudo[103117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msbywyltctlxhepfjhlzhdygpujzjivy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215290.574004-212-158956055788867/AnsiballZ_file.py'
Sep 30 06:54:50 compute-0 sudo[103117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:51 compute-0 python3.9[103119]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:51 compute-0 sudo[103117]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:51 compute-0 sudo[103269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dskffyjliwdycaeynlwolrtlknjwgmsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215291.3550253-212-268250113193556/AnsiballZ_file.py'
Sep 30 06:54:51 compute-0 sudo[103269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:51 compute-0 python3.9[103271]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:51 compute-0 sudo[103269]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:52 compute-0 sudo[103421]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ooundfowlvarxtzhgbvkqgepngsviczc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215292.1331165-212-156467492511927/AnsiballZ_file.py'
Sep 30 06:54:52 compute-0 sudo[103421]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:52 compute-0 python3.9[103423]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:52 compute-0 sudo[103421]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:53 compute-0 sudo[103573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmzlhnqaofugrnvvlwmapicnuiicwzej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215293.1088083-312-244811930518422/AnsiballZ_file.py'
Sep 30 06:54:53 compute-0 sudo[103573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:53 compute-0 python3.9[103575]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:53 compute-0 sudo[103573]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:54 compute-0 sudo[103725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjiwaqgprhizsonkncekpirihwdmflkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215293.9184244-312-112391579876337/AnsiballZ_file.py'
Sep 30 06:54:54 compute-0 sudo[103725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:54 compute-0 python3.9[103727]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:54 compute-0 sudo[103725]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:55 compute-0 sudo[103877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdvjgzyeyzegvoejwqzcenlwibexewki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215294.73034-312-242952883157436/AnsiballZ_file.py'
Sep 30 06:54:55 compute-0 sudo[103877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:55 compute-0 python3.9[103879]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:55 compute-0 sudo[103877]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:55 compute-0 sudo[104029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dltkodbevhktaqxueuacowixbapwslhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215295.453779-312-29788548182098/AnsiballZ_file.py'
Sep 30 06:54:55 compute-0 sudo[104029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:56 compute-0 python3.9[104031]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:56 compute-0 sudo[104029]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:56 compute-0 sudo[104181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjoihziijixoftnsfduajkroguzmcmng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215296.25381-312-80804016478026/AnsiballZ_file.py'
Sep 30 06:54:56 compute-0 sudo[104181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:56 compute-0 python3.9[104183]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:56 compute-0 sudo[104181]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:57 compute-0 sudo[104333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyqqhjpaccmlnjnrszbenchhtogtuatr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215297.0891702-312-254533666071455/AnsiballZ_file.py'
Sep 30 06:54:57 compute-0 sudo[104333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:57 compute-0 python3.9[104335]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:57 compute-0 sudo[104333]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:58 compute-0 sudo[104485]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpyumxwqhovhqdtamkybjwmpcpsafgph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215297.9431727-312-6913229408469/AnsiballZ_file.py'
Sep 30 06:54:58 compute-0 sudo[104485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:58 compute-0 python3.9[104487]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:54:58 compute-0 sudo[104485]: pam_unix(sudo:session): session closed for user root
Sep 30 06:54:59 compute-0 sudo[104637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nezugwgbnuvvjqnsugevhmnrpcfjddqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215298.9194145-414-204231490434362/AnsiballZ_command.py'
Sep 30 06:54:59 compute-0 sudo[104637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:54:59 compute-0 python3.9[104639]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:54:59 compute-0 sudo[104637]: pam_unix(sudo:session): session closed for user root
Sep 30 06:55:00 compute-0 python3.9[104791]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 06:55:01 compute-0 sudo[104941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kidmxlyqbtwvuuawebjjbcitftqjieua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215300.7886546-450-207042899592719/AnsiballZ_systemd_service.py'
Sep 30 06:55:01 compute-0 sudo[104941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:55:01 compute-0 python3.9[104943]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 06:55:01 compute-0 systemd[1]: Reloading.
Sep 30 06:55:01 compute-0 systemd-sysv-generator[104974]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:55:01 compute-0 systemd-rc-local-generator[104971]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:55:01 compute-0 sudo[104941]: pam_unix(sudo:session): session closed for user root
Sep 30 06:55:02 compute-0 sudo[105128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ovlssndnrhknureadmbcqdyuocdkbotb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215302.0116794-466-10721907686510/AnsiballZ_command.py'
Sep 30 06:55:02 compute-0 sudo[105128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:55:02 compute-0 python3.9[105130]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:55:02 compute-0 sudo[105128]: pam_unix(sudo:session): session closed for user root
Sep 30 06:55:03 compute-0 sudo[105281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tejupzajiuhbkefixjfoynmgndbjawvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215302.8377442-466-167989993237748/AnsiballZ_command.py'
Sep 30 06:55:03 compute-0 sudo[105281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:55:03 compute-0 python3.9[105283]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:55:03 compute-0 sudo[105281]: pam_unix(sudo:session): session closed for user root
Sep 30 06:55:04 compute-0 sudo[105434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guccxlupqahjvcyncymazrvwiwuyxing ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215303.6170137-466-228864049331527/AnsiballZ_command.py'
Sep 30 06:55:04 compute-0 sudo[105434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:55:04 compute-0 python3.9[105436]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:55:04 compute-0 sudo[105434]: pam_unix(sudo:session): session closed for user root
Sep 30 06:55:04 compute-0 sudo[105587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gardilmdwqijfwhhpvkldfqudsjlcjld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215304.459365-466-53960090249832/AnsiballZ_command.py'
Sep 30 06:55:04 compute-0 sudo[105587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:55:05 compute-0 python3.9[105589]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:55:05 compute-0 sudo[105587]: pam_unix(sudo:session): session closed for user root
Sep 30 06:55:05 compute-0 sudo[105740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjvejrbdckzwwuzlcecgluflkwdogubn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215305.249445-466-160457285499296/AnsiballZ_command.py'
Sep 30 06:55:05 compute-0 sudo[105740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:55:05 compute-0 python3.9[105742]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:55:05 compute-0 sudo[105740]: pam_unix(sudo:session): session closed for user root
Sep 30 06:55:06 compute-0 sudo[105893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-logblyyrvevwjfjxtnatpzzvqfphtruu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215306.0504558-466-259758743411917/AnsiballZ_command.py'
Sep 30 06:55:06 compute-0 sudo[105893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:55:06 compute-0 python3.9[105895]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:55:06 compute-0 sudo[105893]: pam_unix(sudo:session): session closed for user root
Sep 30 06:55:07 compute-0 sudo[106046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfsrcireptcmtjajklqqdixsdzldezkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215306.865491-466-89150656038982/AnsiballZ_command.py'
Sep 30 06:55:07 compute-0 sudo[106046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:55:07 compute-0 python3.9[106048]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:55:07 compute-0 sudo[106046]: pam_unix(sudo:session): session closed for user root
Sep 30 06:55:08 compute-0 sudo[106199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbitfkyeemtfnbiptuquceobdodbeueb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215308.044931-574-82464830469389/AnsiballZ_getent.py'
Sep 30 06:55:08 compute-0 sudo[106199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:55:08 compute-0 python3.9[106201]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Sep 30 06:55:08 compute-0 sudo[106199]: pam_unix(sudo:session): session closed for user root
Sep 30 06:55:09 compute-0 sudo[106352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-illktyubzborcpvbegycbbdmpbqpxmnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215309.0554373-590-113253197215246/AnsiballZ_group.py'
Sep 30 06:55:09 compute-0 sudo[106352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:55:09 compute-0 python3.9[106354]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 06:55:09 compute-0 groupadd[106355]: group added to /etc/group: name=libvirt, GID=42473
Sep 30 06:55:09 compute-0 groupadd[106355]: group added to /etc/gshadow: name=libvirt
Sep 30 06:55:09 compute-0 groupadd[106355]: new group: name=libvirt, GID=42473
Sep 30 06:55:09 compute-0 sudo[106352]: pam_unix(sudo:session): session closed for user root
Sep 30 06:55:10 compute-0 sudo[106510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqwetezotsguzflcdcoremgyfgqgzdvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215310.2008662-606-112337510147046/AnsiballZ_user.py'
Sep 30 06:55:10 compute-0 sudo[106510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:55:11 compute-0 python3.9[106512]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 06:55:11 compute-0 useradd[106514]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 06:55:11 compute-0 sudo[106510]: pam_unix(sudo:session): session closed for user root
Sep 30 06:55:12 compute-0 sudo[106670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xytidtbmkmttersaixxmiqmaqkscizfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215311.6431522-628-103427144109785/AnsiballZ_setup.py'
Sep 30 06:55:12 compute-0 sudo[106670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:55:12 compute-0 python3.9[106672]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 06:55:12 compute-0 sudo[106670]: pam_unix(sudo:session): session closed for user root
Sep 30 06:55:13 compute-0 sudo[106754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uggoeqlbhwyrzxuephzhjlslpbbnrqff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215311.6431522-628-103427144109785/AnsiballZ_dnf.py'
Sep 30 06:55:13 compute-0 sudo[106754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:55:13 compute-0 python3.9[106756]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 06:55:15 compute-0 podman[106765]: 2025-09-30 06:55:15.549999828 +0000 UTC m=+0.126545746 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 06:55:19 compute-0 podman[106821]: 2025-09-30 06:55:19.485625564 +0000 UTC m=+0.071534131 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 06:55:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:55:20.490 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 06:55:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:55:20.491 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 06:55:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:55:20.491 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 06:55:38 compute-0 kernel: SELinux:  Converting 2751 SID table entries...
Sep 30 06:55:38 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 06:55:38 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 06:55:38 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 06:55:38 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 06:55:38 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 06:55:38 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 06:55:38 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 06:55:46 compute-0 dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Sep 30 06:55:46 compute-0 podman[107001]: 2025-09-30 06:55:46.593754268 +0000 UTC m=+0.156742068 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Sep 30 06:55:47 compute-0 kernel: SELinux:  Converting 2751 SID table entries...
Sep 30 06:55:47 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 06:55:47 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 06:55:47 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 06:55:47 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 06:55:47 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 06:55:47 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 06:55:47 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 06:55:50 compute-0 dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Sep 30 06:55:50 compute-0 podman[107034]: 2025-09-30 06:55:50.544814382 +0000 UTC m=+0.104980391 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930)
Sep 30 06:56:17 compute-0 podman[115272]: 2025-09-30 06:56:17.553313819 +0000 UTC m=+0.130153188 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 06:56:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:56:20.492 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 06:56:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:56:20.493 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 06:56:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:56:20.493 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 06:56:21 compute-0 podman[117082]: 2025-09-30 06:56:21.507588436 +0000 UTC m=+0.084489703 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 06:56:48 compute-0 kernel: SELinux:  Converting 2752 SID table entries...
Sep 30 06:56:48 compute-0 kernel: SELinux:  policy capability network_peer_controls=1
Sep 30 06:56:48 compute-0 kernel: SELinux:  policy capability open_perms=1
Sep 30 06:56:48 compute-0 kernel: SELinux:  policy capability extended_socket_class=1
Sep 30 06:56:48 compute-0 kernel: SELinux:  policy capability always_check_network=0
Sep 30 06:56:48 compute-0 kernel: SELinux:  policy capability cgroup_seclabel=1
Sep 30 06:56:48 compute-0 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Sep 30 06:56:48 compute-0 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Sep 30 06:56:48 compute-0 dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Sep 30 06:56:48 compute-0 podman[123848]: 2025-09-30 06:56:48.598001155 +0000 UTC m=+0.162503894 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, config_id=ovn_controller, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Sep 30 06:56:49 compute-0 groupadd[123880]: group added to /etc/group: name=dnsmasq, GID=992
Sep 30 06:56:49 compute-0 groupadd[123880]: group added to /etc/gshadow: name=dnsmasq
Sep 30 06:56:49 compute-0 groupadd[123880]: new group: name=dnsmasq, GID=992
Sep 30 06:56:49 compute-0 useradd[123887]: new user: name=dnsmasq, UID=992, GID=992, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Sep 30 06:56:49 compute-0 dbus-broker-launch[807]: Noticed file-system modification, trigger reload.
Sep 30 06:56:49 compute-0 dbus-broker-launch[807]: Noticed file-system modification, trigger reload.
Sep 30 06:56:50 compute-0 groupadd[123900]: group added to /etc/group: name=clevis, GID=991
Sep 30 06:56:50 compute-0 groupadd[123900]: group added to /etc/gshadow: name=clevis
Sep 30 06:56:50 compute-0 groupadd[123900]: new group: name=clevis, GID=991
Sep 30 06:56:50 compute-0 useradd[123907]: new user: name=clevis, UID=991, GID=991, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Sep 30 06:56:50 compute-0 usermod[123917]: add 'clevis' to group 'tss'
Sep 30 06:56:50 compute-0 usermod[123917]: add 'clevis' to shadow group 'tss'
Sep 30 06:56:51 compute-0 podman[123931]: 2025-09-30 06:56:51.649744348 +0000 UTC m=+0.069790890 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Sep 30 06:56:53 compute-0 polkitd[6182]: Reloading rules
Sep 30 06:56:53 compute-0 polkitd[6182]: Collecting garbage unconditionally...
Sep 30 06:56:53 compute-0 polkitd[6182]: Loading rules from directory /etc/polkit-1/rules.d
Sep 30 06:56:53 compute-0 polkitd[6182]: Loading rules from directory /usr/share/polkit-1/rules.d
Sep 30 06:56:53 compute-0 polkitd[6182]: Finished loading, compiling and executing 4 rules
Sep 30 06:56:53 compute-0 polkitd[6182]: Reloading rules
Sep 30 06:56:53 compute-0 polkitd[6182]: Collecting garbage unconditionally...
Sep 30 06:56:53 compute-0 polkitd[6182]: Loading rules from directory /etc/polkit-1/rules.d
Sep 30 06:56:53 compute-0 polkitd[6182]: Loading rules from directory /usr/share/polkit-1/rules.d
Sep 30 06:56:53 compute-0 polkitd[6182]: Finished loading, compiling and executing 4 rules
Sep 30 06:56:54 compute-0 groupadd[124123]: group added to /etc/group: name=ceph, GID=167
Sep 30 06:56:54 compute-0 groupadd[124123]: group added to /etc/gshadow: name=ceph
Sep 30 06:56:54 compute-0 groupadd[124123]: new group: name=ceph, GID=167
Sep 30 06:56:54 compute-0 useradd[124129]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Sep 30 06:56:57 compute-0 systemd[1]: Stopping OpenSSH server daemon...
Sep 30 06:56:57 compute-0 sshd[1008]: Received signal 15; terminating.
Sep 30 06:56:57 compute-0 systemd[1]: sshd.service: Deactivated successfully.
Sep 30 06:56:57 compute-0 systemd[1]: Stopped OpenSSH server daemon.
Sep 30 06:56:57 compute-0 systemd[1]: sshd.service: Consumed 3.934s CPU time, read 0B from disk, written 88.0K to disk.
Sep 30 06:56:57 compute-0 systemd[1]: Stopped target sshd-keygen.target.
Sep 30 06:56:57 compute-0 systemd[1]: Stopping sshd-keygen.target...
Sep 30 06:56:57 compute-0 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 06:56:57 compute-0 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 06:56:57 compute-0 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Sep 30 06:56:57 compute-0 systemd[1]: Reached target sshd-keygen.target.
Sep 30 06:56:57 compute-0 systemd[1]: Starting OpenSSH server daemon...
Sep 30 06:56:57 compute-0 sshd[124648]: Server listening on 0.0.0.0 port 22.
Sep 30 06:56:57 compute-0 sshd[124648]: Server listening on :: port 22.
Sep 30 06:56:57 compute-0 systemd[1]: Started OpenSSH server daemon.
Sep 30 06:56:59 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 06:56:59 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 06:57:00 compute-0 systemd[1]: Reloading.
Sep 30 06:57:00 compute-0 systemd-rc-local-generator[124902]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:57:00 compute-0 systemd-sysv-generator[124908]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:57:00 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 06:57:02 compute-0 systemd[1]: Starting PackageKit Daemon...
Sep 30 06:57:02 compute-0 PackageKit[126896]: daemon start
Sep 30 06:57:02 compute-0 systemd[1]: Started PackageKit Daemon.
Sep 30 06:57:02 compute-0 sudo[106754]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:03 compute-0 sudo[128277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkxutiytoudlzgcgumtsurzklpotcfvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215422.9252408-652-205028418780801/AnsiballZ_systemd.py'
Sep 30 06:57:03 compute-0 sudo[128277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:03 compute-0 python3.9[128300]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 06:57:04 compute-0 systemd[1]: Reloading.
Sep 30 06:57:04 compute-0 systemd-rc-local-generator[128707]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:57:04 compute-0 systemd-sysv-generator[128712]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:57:04 compute-0 sudo[128277]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:04 compute-0 sudo[129473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ippoahduvgntbgdaquqtjnslutdymfdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215424.5275962-652-220166659718224/AnsiballZ_systemd.py'
Sep 30 06:57:04 compute-0 sudo[129473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:05 compute-0 python3.9[129491]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 06:57:05 compute-0 systemd[1]: Reloading.
Sep 30 06:57:05 compute-0 systemd-rc-local-generator[129841]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:57:05 compute-0 systemd-sysv-generator[129845]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:57:05 compute-0 sudo[129473]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:06 compute-0 sudo[130572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-illppxbtviiqtodiohvoahohrqdqxxmj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215425.7271621-652-79488996080904/AnsiballZ_systemd.py'
Sep 30 06:57:06 compute-0 sudo[130572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:06 compute-0 python3.9[130593]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 06:57:06 compute-0 systemd[1]: Reloading.
Sep 30 06:57:06 compute-0 systemd-sysv-generator[131017]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:57:06 compute-0 systemd-rc-local-generator[131012]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:57:06 compute-0 sudo[130572]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:07 compute-0 sudo[131684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izfxlmmnhdekzybaucpenqzkzbrbmuts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215427.053421-652-188458330971183/AnsiballZ_systemd.py'
Sep 30 06:57:07 compute-0 sudo[131684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:07 compute-0 python3.9[131698]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 06:57:07 compute-0 systemd[1]: Reloading.
Sep 30 06:57:07 compute-0 systemd-rc-local-generator[132016]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:57:07 compute-0 systemd-sysv-generator[132023]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:57:08 compute-0 sudo[131684]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:08 compute-0 sudo[132780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uarsgkmtffnlnyjvldtmijmcrblukzpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215428.2623754-710-86783475796227/AnsiballZ_systemd.py'
Sep 30 06:57:08 compute-0 sudo[132780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:08 compute-0 python3.9[132801]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:09 compute-0 systemd[1]: Reloading.
Sep 30 06:57:09 compute-0 systemd-sysv-generator[133276]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:57:09 compute-0 systemd-rc-local-generator[133272]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:57:09 compute-0 sudo[132780]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:09 compute-0 sudo[133787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owmgspejopheexxshoifscvmnhrdwrxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215429.4750824-710-263729597993423/AnsiballZ_systemd.py'
Sep 30 06:57:09 compute-0 sudo[133787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:10 compute-0 python3.9[133811]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:10 compute-0 systemd[1]: Reloading.
Sep 30 06:57:10 compute-0 systemd-rc-local-generator[134167]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:57:10 compute-0 systemd-sysv-generator[134171]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:57:10 compute-0 sudo[133787]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:11 compute-0 sudo[134440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-phigogdledhdjiagbulxmxgzzproyazl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215430.8276362-710-277361909077283/AnsiballZ_systemd.py'
Sep 30 06:57:11 compute-0 sudo[134440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:11 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 06:57:11 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 06:57:11 compute-0 systemd[1]: man-db-cache-update.service: Consumed 13.156s CPU time.
Sep 30 06:57:11 compute-0 systemd[1]: run-r58b47a24698847b798aac62757968831.service: Deactivated successfully.
Sep 30 06:57:11 compute-0 python3.9[134442]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:11 compute-0 systemd[1]: Reloading.
Sep 30 06:57:11 compute-0 systemd-rc-local-generator[134472]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:57:11 compute-0 systemd-sysv-generator[134476]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:57:11 compute-0 sudo[134440]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:12 compute-0 sudo[134630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaxdtncozvskrcbngrmpeekwtpgasmkm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215432.189994-710-22065671021054/AnsiballZ_systemd.py'
Sep 30 06:57:12 compute-0 sudo[134630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:12 compute-0 python3.9[134632]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:12 compute-0 sudo[134630]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:13 compute-0 sudo[134785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjwjmnenqrrxntadyegvfoyulthbbcbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215433.0498528-710-232670162161559/AnsiballZ_systemd.py'
Sep 30 06:57:13 compute-0 sudo[134785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:13 compute-0 python3.9[134787]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:13 compute-0 systemd[1]: Reloading.
Sep 30 06:57:13 compute-0 systemd-rc-local-generator[134819]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:57:14 compute-0 systemd-sysv-generator[134822]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:57:14 compute-0 sudo[134785]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:14 compute-0 sudo[134976]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ukhlqyfmqiluqztvazdggxitmpslafsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215434.4274788-782-177952920491397/AnsiballZ_systemd.py'
Sep 30 06:57:14 compute-0 sudo[134976]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:15 compute-0 python3.9[134978]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Sep 30 06:57:16 compute-0 systemd[1]: Reloading.
Sep 30 06:57:16 compute-0 systemd-rc-local-generator[135004]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:57:16 compute-0 systemd-sysv-generator[135008]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:57:16 compute-0 systemd[1]: Listening on libvirt proxy daemon socket.
Sep 30 06:57:16 compute-0 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Sep 30 06:57:16 compute-0 sudo[134976]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:17 compute-0 sudo[135169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gkdqgzoistobwlyzmmtxfsorvqyifdcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215436.876798-798-11752986170600/AnsiballZ_systemd.py'
Sep 30 06:57:17 compute-0 sudo[135169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:17 compute-0 python3.9[135171]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:17 compute-0 sudo[135169]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:18 compute-0 sudo[135324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncdhokcgtccdgmtqpxstctqooptucwen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215437.8367417-798-117872168685002/AnsiballZ_systemd.py'
Sep 30 06:57:18 compute-0 sudo[135324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:18 compute-0 python3.9[135326]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:18 compute-0 sudo[135324]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:18 compute-0 podman[135330]: 2025-09-30 06:57:18.823091441 +0000 UTC m=+0.155409427 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20250930)
Sep 30 06:57:19 compute-0 sudo[135503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzkjwtppvmpbtucvbpgyzuhdugcbgfmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215438.8625343-798-71129219276441/AnsiballZ_systemd.py'
Sep 30 06:57:19 compute-0 sudo[135503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:19 compute-0 python3.9[135505]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:19 compute-0 sudo[135503]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:20 compute-0 sudo[135658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgavvmszdcqgermkzwbqwlbxpvvxeksm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215439.7745802-798-255682063317509/AnsiballZ_systemd.py'
Sep 30 06:57:20 compute-0 sudo[135658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:20 compute-0 python3.9[135660]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:57:20.495 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 06:57:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:57:20.495 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 06:57:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:57:20.495 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 06:57:20 compute-0 sudo[135658]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:21 compute-0 sudo[135814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgfcshbdlvusecafgrrpitwcvmedxzar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215440.721248-798-68271781139649/AnsiballZ_systemd.py'
Sep 30 06:57:21 compute-0 sudo[135814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:21 compute-0 python3.9[135816]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:21 compute-0 sudo[135814]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:22 compute-0 sudo[135982]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnkpnwqwqqovckycdxjalireaoroypgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215441.6627097-798-190411304069401/AnsiballZ_systemd.py'
Sep 30 06:57:22 compute-0 sudo[135982]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:22 compute-0 podman[135943]: 2025-09-30 06:57:22.065942781 +0000 UTC m=+0.080259072 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 06:57:22 compute-0 python3.9[135990]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:22 compute-0 sudo[135982]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:23 compute-0 sudo[136143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqbdzsmcurlkhplcjgwwrvvbgsgvceji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215442.6434517-798-224821333683535/AnsiballZ_systemd.py'
Sep 30 06:57:23 compute-0 sudo[136143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:23 compute-0 python3.9[136145]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:23 compute-0 sudo[136143]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:24 compute-0 sudo[136298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnbojdrlneogwpysusagikfxjywareev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215443.6180127-798-101627399907202/AnsiballZ_systemd.py'
Sep 30 06:57:24 compute-0 sudo[136298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:24 compute-0 python3.9[136300]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:24 compute-0 sudo[136298]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:24 compute-0 sudo[136453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvcbwxxdzmohuifptztbkpeumcyrhosz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215444.5889387-798-29572538952366/AnsiballZ_systemd.py'
Sep 30 06:57:24 compute-0 sudo[136453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:25 compute-0 python3.9[136455]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:25 compute-0 sudo[136453]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:25 compute-0 sudo[136608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-incvdprspdaveygjgpsdwbqkscrdoupz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215445.5000947-798-61251946349540/AnsiballZ_systemd.py'
Sep 30 06:57:25 compute-0 sudo[136608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:26 compute-0 python3.9[136610]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:26 compute-0 sudo[136608]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:26 compute-0 sudo[136763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuabphmbnrvmupvhqvtswiwczlipvwzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215446.4937766-798-43434540713316/AnsiballZ_systemd.py'
Sep 30 06:57:26 compute-0 sudo[136763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:27 compute-0 python3.9[136765]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:27 compute-0 sudo[136763]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:27 compute-0 sudo[136918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqngxtnjscioffwipjhwhgacsauqxacm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215447.484465-798-274492276368028/AnsiballZ_systemd.py'
Sep 30 06:57:27 compute-0 sudo[136918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:28 compute-0 python3.9[136920]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:28 compute-0 sudo[136918]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:28 compute-0 sudo[137073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dmceczwvuxrbrpfnofgrivchkhsejotc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215448.4913018-798-154460321293441/AnsiballZ_systemd.py'
Sep 30 06:57:28 compute-0 sudo[137073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:29 compute-0 python3.9[137075]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:29 compute-0 sudo[137073]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:29 compute-0 sudo[137228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayqtecfjswbhhtxdppujhsliinpentbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215449.5293372-798-279438234144753/AnsiballZ_systemd.py'
Sep 30 06:57:29 compute-0 sudo[137228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:30 compute-0 python3.9[137230]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Sep 30 06:57:30 compute-0 sudo[137228]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:31 compute-0 sudo[137383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlxntudsccdnfjtbfjfbivjljcjnqqdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215450.8237026-1002-79400986536586/AnsiballZ_file.py'
Sep 30 06:57:31 compute-0 sudo[137383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:31 compute-0 python3.9[137385]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:57:31 compute-0 sudo[137383]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:31 compute-0 sudo[137535]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qkvwlvxuftxzgatgqrglkyuqnvufkoze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215451.5940104-1002-263634725296182/AnsiballZ_file.py'
Sep 30 06:57:31 compute-0 sudo[137535]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:32 compute-0 python3.9[137537]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:57:32 compute-0 sudo[137535]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:32 compute-0 sudo[137687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzookvqezpleidzskhdovvakygaytovj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215452.38398-1002-223878113617215/AnsiballZ_file.py'
Sep 30 06:57:32 compute-0 sudo[137687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:32 compute-0 python3.9[137689]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:57:32 compute-0 sudo[137687]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:33 compute-0 sudo[137839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ibcurxnjghmcyudnhcodpmwyiquhxpjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215453.193584-1002-259961430496472/AnsiballZ_file.py'
Sep 30 06:57:33 compute-0 sudo[137839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:33 compute-0 python3.9[137841]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:57:33 compute-0 sudo[137839]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:34 compute-0 sudo[137991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcicaswovizjyqerqrnbwhcxnkoperle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215454.0177631-1002-65163468035433/AnsiballZ_file.py'
Sep 30 06:57:34 compute-0 sudo[137991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:34 compute-0 python3.9[137993]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:57:34 compute-0 sudo[137991]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:35 compute-0 sudo[138143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-atmkevanztsevxhohjdlhghetoyekazo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215454.7780757-1002-243395675145824/AnsiballZ_file.py'
Sep 30 06:57:35 compute-0 sudo[138143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:35 compute-0 python3.9[138145]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:57:35 compute-0 sudo[138143]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:36 compute-0 sudo[138295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fnawuvcucsajoccmxmpuacajedcdewee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215455.5653267-1088-92989549706408/AnsiballZ_stat.py'
Sep 30 06:57:36 compute-0 sudo[138295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:36 compute-0 python3.9[138297]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:57:36 compute-0 sudo[138295]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:36 compute-0 sudo[138420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecprrrbinwajaafbzdomdkxmuxbykmgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215455.5653267-1088-92989549706408/AnsiballZ_copy.py'
Sep 30 06:57:36 compute-0 sudo[138420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:37 compute-0 python3.9[138422]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759215455.5653267-1088-92989549706408/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:37 compute-0 sudo[138420]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:37 compute-0 sudo[138572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrhpxatwkxyswnvmyqjnzjwtrnagbumo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215457.2879624-1088-201238272323561/AnsiballZ_stat.py'
Sep 30 06:57:37 compute-0 sudo[138572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:37 compute-0 python3.9[138574]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:57:37 compute-0 sudo[138572]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:38 compute-0 sudo[138697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qmdoxtonuikrohnpbfbnzdllahzemufa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215457.2879624-1088-201238272323561/AnsiballZ_copy.py'
Sep 30 06:57:38 compute-0 sudo[138697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:38 compute-0 python3.9[138699]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759215457.2879624-1088-201238272323561/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:38 compute-0 sudo[138697]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:39 compute-0 sudo[138849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyulouhzrzmkxkyvjbqofwaqgjpwveho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215458.8262835-1088-109696758800783/AnsiballZ_stat.py'
Sep 30 06:57:39 compute-0 sudo[138849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:39 compute-0 python3.9[138851]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:57:39 compute-0 sudo[138849]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:39 compute-0 sudo[138974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usoxysduestxibnjggtpilgzjbayiqfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215458.8262835-1088-109696758800783/AnsiballZ_copy.py'
Sep 30 06:57:39 compute-0 sudo[138974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:40 compute-0 python3.9[138976]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759215458.8262835-1088-109696758800783/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:40 compute-0 sudo[138974]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:40 compute-0 sudo[139126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssgplfbmrbjaiikgfnbqolrilsoeszhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215460.3449306-1088-76516844123434/AnsiballZ_stat.py'
Sep 30 06:57:40 compute-0 sudo[139126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:40 compute-0 python3.9[139128]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:57:41 compute-0 sudo[139126]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:41 compute-0 sudo[139251]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jfvayhxaljxplurqtiibrkiivrfbkdli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215460.3449306-1088-76516844123434/AnsiballZ_copy.py'
Sep 30 06:57:41 compute-0 sudo[139251]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:41 compute-0 python3.9[139253]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759215460.3449306-1088-76516844123434/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:41 compute-0 sudo[139251]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:42 compute-0 sudo[139403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhzlcrowsrjuallhzkdbkmisubfvrxeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215461.9660432-1088-195540836061562/AnsiballZ_stat.py'
Sep 30 06:57:42 compute-0 sudo[139403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:42 compute-0 python3.9[139405]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:57:42 compute-0 sudo[139403]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:43 compute-0 sudo[139528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldtwniizoubytfpftqlujkbbsmowfete ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215461.9660432-1088-195540836061562/AnsiballZ_copy.py'
Sep 30 06:57:43 compute-0 sudo[139528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:43 compute-0 python3.9[139530]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759215461.9660432-1088-195540836061562/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:43 compute-0 sudo[139528]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:43 compute-0 sudo[139680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhwgovtaivzjlnofznzvdsbrqsmyooqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215463.5433583-1088-114453495099690/AnsiballZ_stat.py'
Sep 30 06:57:43 compute-0 sudo[139680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:44 compute-0 python3.9[139682]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:57:44 compute-0 sudo[139680]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:44 compute-0 sudo[139805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrilrgtiwclcvsivyfyllplzmjswaepg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215463.5433583-1088-114453495099690/AnsiballZ_copy.py'
Sep 30 06:57:44 compute-0 sudo[139805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:44 compute-0 python3.9[139807]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759215463.5433583-1088-114453495099690/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:44 compute-0 sudo[139805]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:45 compute-0 sudo[139957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dgmcquapjukmfbxzlrbyryhwsquujpyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215465.0990376-1088-55596475577635/AnsiballZ_stat.py'
Sep 30 06:57:45 compute-0 sudo[139957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:45 compute-0 python3.9[139959]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:57:45 compute-0 sudo[139957]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:46 compute-0 sudo[140080]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-egbodzjwzcdppckqqqemmjkyyxrqvtqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215465.0990376-1088-55596475577635/AnsiballZ_copy.py'
Sep 30 06:57:46 compute-0 sudo[140080]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:46 compute-0 python3.9[140082]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759215465.0990376-1088-55596475577635/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:46 compute-0 sudo[140080]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:46 compute-0 sudo[140232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmgjjusmdastkhimjqavxjauyzompjcn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215466.630297-1088-275144781428260/AnsiballZ_stat.py'
Sep 30 06:57:46 compute-0 sudo[140232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:47 compute-0 python3.9[140234]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:57:47 compute-0 sudo[140232]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:47 compute-0 sudo[140357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njjlaxzbldlxxncpvcioovlfqbkgzqui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215466.630297-1088-275144781428260/AnsiballZ_copy.py'
Sep 30 06:57:47 compute-0 sudo[140357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:47 compute-0 python3.9[140359]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759215466.630297-1088-275144781428260/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:47 compute-0 sudo[140357]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:48 compute-0 sudo[140509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkmbdgfgmitidcyrfogjxfubodmisoul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215468.1997812-1314-76359915298345/AnsiballZ_command.py'
Sep 30 06:57:48 compute-0 sudo[140509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:48 compute-0 python3.9[140511]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Sep 30 06:57:48 compute-0 sudo[140509]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:49 compute-0 podman[140612]: 2025-09-30 06:57:49.573707555 +0000 UTC m=+0.144629252 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 06:57:49 compute-0 sudo[140690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lswxkxmjkcpgyahinwqgryczetxhtgnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215469.2074296-1332-156913320096623/AnsiballZ_file.py'
Sep 30 06:57:49 compute-0 sudo[140690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:49 compute-0 python3.9[140692]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:49 compute-0 sudo[140690]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:50 compute-0 sudo[140842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axzosjvnonbguperzoswgauckdcsgkmr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215470.030411-1332-192503888923227/AnsiballZ_file.py'
Sep 30 06:57:50 compute-0 sudo[140842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:50 compute-0 python3.9[140844]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:50 compute-0 sudo[140842]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:51 compute-0 sudo[140994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uivhkakfwgffvtupeqgnmqppxkoyllyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215470.8092453-1332-109688805096147/AnsiballZ_file.py'
Sep 30 06:57:51 compute-0 sudo[140994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:51 compute-0 python3.9[140996]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:51 compute-0 sudo[140994]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:51 compute-0 sudo[141146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzovkmgogzfzpibydpsvvfrrocgtzbej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215471.6341922-1332-214319118749360/AnsiballZ_file.py'
Sep 30 06:57:51 compute-0 sudo[141146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:52 compute-0 python3.9[141148]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:52 compute-0 sudo[141146]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:52 compute-0 podman[141196]: 2025-09-30 06:57:52.47958338 +0000 UTC m=+0.067516963 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 06:57:52 compute-0 sudo[141318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqgjycaqolmhufridnxutcevdwanibdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215472.3898692-1332-8525933957815/AnsiballZ_file.py'
Sep 30 06:57:52 compute-0 sudo[141318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:52 compute-0 python3.9[141320]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:52 compute-0 sudo[141318]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:53 compute-0 sudo[141470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwzluwyvqfcfrifgrfcvsmtamiszplpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215473.1701703-1332-47451863667702/AnsiballZ_file.py'
Sep 30 06:57:53 compute-0 sudo[141470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:53 compute-0 python3.9[141472]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:53 compute-0 sudo[141470]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:54 compute-0 sudo[141622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wankhslxeekfvnxqdupupuyxwfeehyhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215473.9730341-1332-270644841677658/AnsiballZ_file.py'
Sep 30 06:57:54 compute-0 sudo[141622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:54 compute-0 python3.9[141624]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:54 compute-0 sudo[141622]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:55 compute-0 sudo[141774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raivgxflhoyajmqhdfzujjpyhnbatkwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215474.8072574-1332-3739600985274/AnsiballZ_file.py'
Sep 30 06:57:55 compute-0 sudo[141774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:55 compute-0 python3.9[141776]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:55 compute-0 sudo[141774]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:55 compute-0 sudo[141926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqnyagfcuabjsdwelvnjjoommrchfgvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215475.5860007-1332-173948747396695/AnsiballZ_file.py'
Sep 30 06:57:55 compute-0 sudo[141926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:56 compute-0 python3.9[141928]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:56 compute-0 sudo[141926]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:56 compute-0 sudo[142078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vdaqvztbuspdykxshppyimyppfhjvmnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215476.396228-1332-16576749158558/AnsiballZ_file.py'
Sep 30 06:57:56 compute-0 sudo[142078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:56 compute-0 python3.9[142080]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:57 compute-0 sudo[142078]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:57 compute-0 sudo[142230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lucqvxesbfuqkdtsypvssukhjjduvryj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215477.1991713-1332-136232980541914/AnsiballZ_file.py'
Sep 30 06:57:57 compute-0 sudo[142230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:57 compute-0 python3.9[142232]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:57 compute-0 sudo[142230]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:58 compute-0 sudo[142382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvtchchdvcurmgyxplmbtogwxeanpvhh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215478.0467181-1332-238748705523903/AnsiballZ_file.py'
Sep 30 06:57:58 compute-0 sudo[142382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:58 compute-0 python3.9[142384]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:58 compute-0 sudo[142382]: pam_unix(sudo:session): session closed for user root
Sep 30 06:57:59 compute-0 sudo[142534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmcvebjmjhwzrcvjkyogpwzoybkbsori ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215478.833754-1332-65679279860217/AnsiballZ_file.py'
Sep 30 06:57:59 compute-0 sudo[142534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:57:59 compute-0 python3.9[142536]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:57:59 compute-0 sudo[142534]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:00 compute-0 sudo[142686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrhmggtmbawkskrhkzeborltnkoxzsgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215479.6633613-1332-179136836347087/AnsiballZ_file.py'
Sep 30 06:58:00 compute-0 sudo[142686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:00 compute-0 python3.9[142688]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:00 compute-0 sudo[142686]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:00 compute-0 sudo[142838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-riispgwagojpuazbqhksznqdfjjmsqui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215480.533792-1530-35631007119801/AnsiballZ_stat.py'
Sep 30 06:58:00 compute-0 sudo[142838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:01 compute-0 python3.9[142840]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:01 compute-0 sudo[142838]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:01 compute-0 sudo[142961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmsbqpercsayuanubvwzyopjvdmtzuey ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215480.533792-1530-35631007119801/AnsiballZ_copy.py'
Sep 30 06:58:01 compute-0 sudo[142961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:01 compute-0 python3.9[142963]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215480.533792-1530-35631007119801/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:01 compute-0 sudo[142961]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:02 compute-0 sudo[143113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtglnxslaqekfvjtwypcaluvgjjlmyqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215482.083558-1530-194767593322915/AnsiballZ_stat.py'
Sep 30 06:58:02 compute-0 sudo[143113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:02 compute-0 python3.9[143115]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:02 compute-0 sudo[143113]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:03 compute-0 sudo[143236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjuktgrdsedcpmievnacfmddotsgkbdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215482.083558-1530-194767593322915/AnsiballZ_copy.py'
Sep 30 06:58:03 compute-0 sudo[143236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:03 compute-0 python3.9[143238]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215482.083558-1530-194767593322915/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:03 compute-0 sudo[143236]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:03 compute-0 sudo[143388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-giyyfruswaltziktdndhqkppjurexezm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215483.5629315-1530-244111360039394/AnsiballZ_stat.py'
Sep 30 06:58:03 compute-0 sudo[143388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:04 compute-0 python3.9[143390]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:04 compute-0 sudo[143388]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:04 compute-0 sudo[143511]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyzttcwjgggbvvnylytvqyaugavpvvng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215483.5629315-1530-244111360039394/AnsiballZ_copy.py'
Sep 30 06:58:04 compute-0 sudo[143511]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:04 compute-0 python3.9[143513]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215483.5629315-1530-244111360039394/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:04 compute-0 sudo[143511]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:05 compute-0 sudo[143663]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dujplnzzlhgshennbuqurvpnatpvbbxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215485.0170908-1530-139612871650100/AnsiballZ_stat.py'
Sep 30 06:58:05 compute-0 sudo[143663]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:05 compute-0 python3.9[143665]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:05 compute-0 sudo[143663]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:06 compute-0 sudo[143786]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftmzynmegjyzctvcxcxiwvceahtrzrza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215485.0170908-1530-139612871650100/AnsiballZ_copy.py'
Sep 30 06:58:06 compute-0 sudo[143786]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:06 compute-0 python3.9[143788]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215485.0170908-1530-139612871650100/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:06 compute-0 sudo[143786]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:06 compute-0 sudo[143938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wtkowsgrmvqwtejjiaqbbmqjsyhgerqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215486.5466475-1530-177350170899833/AnsiballZ_stat.py'
Sep 30 06:58:06 compute-0 sudo[143938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:07 compute-0 python3.9[143940]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:07 compute-0 sudo[143938]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:07 compute-0 sudo[144061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twkfbpdecqwjgxyfkrkqurpptglbfzua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215486.5466475-1530-177350170899833/AnsiballZ_copy.py'
Sep 30 06:58:07 compute-0 sudo[144061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:07 compute-0 python3.9[144063]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215486.5466475-1530-177350170899833/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:07 compute-0 sudo[144061]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:08 compute-0 sudo[144213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nvqbdmiouibhtyxymgbltnsufzeqbcii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215488.11354-1530-250055912708419/AnsiballZ_stat.py'
Sep 30 06:58:08 compute-0 sudo[144213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:08 compute-0 python3.9[144215]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:08 compute-0 sudo[144213]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:09 compute-0 sudo[144336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpffrlqmksltykgfmwkhffeyqxyicrxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215488.11354-1530-250055912708419/AnsiballZ_copy.py'
Sep 30 06:58:09 compute-0 sudo[144336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:09 compute-0 python3.9[144338]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215488.11354-1530-250055912708419/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:09 compute-0 sudo[144336]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:10 compute-0 sudo[144488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbyycazhyvvqctejjdtvvdfwtghyfwkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215489.7663999-1530-121980437627417/AnsiballZ_stat.py'
Sep 30 06:58:10 compute-0 sudo[144488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:10 compute-0 python3.9[144490]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:10 compute-0 sudo[144488]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:10 compute-0 sudo[144611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lklvpsoxcqjnpyhiduzieifwjpbrckyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215489.7663999-1530-121980437627417/AnsiballZ_copy.py'
Sep 30 06:58:10 compute-0 sudo[144611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:11 compute-0 python3.9[144613]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215489.7663999-1530-121980437627417/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:11 compute-0 sudo[144611]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:11 compute-0 sudo[144763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oprvxwqqtzglrmheqllbyfgitrlfmfky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215491.350112-1530-275918191986078/AnsiballZ_stat.py'
Sep 30 06:58:11 compute-0 sudo[144763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:12 compute-0 python3.9[144765]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:12 compute-0 sudo[144763]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:12 compute-0 sudo[144886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgiflglmsmuwzyzmyoywwkrjdlidtmrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215491.350112-1530-275918191986078/AnsiballZ_copy.py'
Sep 30 06:58:12 compute-0 sudo[144886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:12 compute-0 python3.9[144888]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215491.350112-1530-275918191986078/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:12 compute-0 sudo[144886]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:13 compute-0 sudo[145038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqtomgwaojofgzolgmykhansflacxobu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215492.9632337-1530-167629254261924/AnsiballZ_stat.py'
Sep 30 06:58:13 compute-0 sudo[145038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:13 compute-0 python3.9[145040]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:13 compute-0 sudo[145038]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:14 compute-0 sudo[145161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jwdoorostnedrkwuusfxskbihedujiuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215492.9632337-1530-167629254261924/AnsiballZ_copy.py'
Sep 30 06:58:14 compute-0 sudo[145161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:14 compute-0 python3.9[145163]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215492.9632337-1530-167629254261924/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:14 compute-0 sudo[145161]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:14 compute-0 sudo[145313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-glhulzhlmfypvxjeamkipejdhaesvpkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215494.4491327-1530-76086951606581/AnsiballZ_stat.py'
Sep 30 06:58:14 compute-0 sudo[145313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:15 compute-0 python3.9[145315]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:15 compute-0 sudo[145313]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:15 compute-0 sudo[145436]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esmcjovlbwwcdaednihcmwinmdkoyfgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215494.4491327-1530-76086951606581/AnsiballZ_copy.py'
Sep 30 06:58:15 compute-0 sudo[145436]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:15 compute-0 python3.9[145438]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215494.4491327-1530-76086951606581/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:15 compute-0 sudo[145436]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:16 compute-0 sudo[145588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfigoqkvwyndgbzktermtmiwscukydfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215495.9322202-1530-68652215225829/AnsiballZ_stat.py'
Sep 30 06:58:16 compute-0 sudo[145588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:16 compute-0 python3.9[145590]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:16 compute-0 sudo[145588]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:16 compute-0 sudo[145711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gsulsvrneggwcpqvtcznoadrorvvspxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215495.9322202-1530-68652215225829/AnsiballZ_copy.py'
Sep 30 06:58:16 compute-0 sudo[145711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:17 compute-0 python3.9[145713]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215495.9322202-1530-68652215225829/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:17 compute-0 sudo[145711]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:17 compute-0 sudo[145863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xibogwjccyncultfqknycjpywwamypak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215497.3528-1530-34288312021793/AnsiballZ_stat.py'
Sep 30 06:58:17 compute-0 sudo[145863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:17 compute-0 python3.9[145865]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:17 compute-0 sudo[145863]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:18 compute-0 sudo[145986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyrdihhacdpnswyeyabngyvvasnvecix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215497.3528-1530-34288312021793/AnsiballZ_copy.py'
Sep 30 06:58:18 compute-0 sudo[145986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:18 compute-0 python3.9[145988]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215497.3528-1530-34288312021793/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:18 compute-0 sudo[145986]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:19 compute-0 sudo[146138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fyihtierhwuwfrnblymqumxwxawpyboa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215498.907152-1530-31940251106219/AnsiballZ_stat.py'
Sep 30 06:58:19 compute-0 sudo[146138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:19 compute-0 python3.9[146140]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:19 compute-0 sudo[146138]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:19 compute-0 sudo[146277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mojjawyoqsrxitlwwdngeeigbzyegcpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215498.907152-1530-31940251106219/AnsiballZ_copy.py'
Sep 30 06:58:20 compute-0 sudo[146277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:20 compute-0 podman[146235]: 2025-09-30 06:58:20.049765977 +0000 UTC m=+0.141573815 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 06:58:20 compute-0 python3.9[146285]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215498.907152-1530-31940251106219/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:20 compute-0 sudo[146277]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:58:20.496 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 06:58:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:58:20.496 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 06:58:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:58:20.496 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 06:58:20 compute-0 sudo[146439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chwidysvcdtvxiyhlwbkowqwgqsicres ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215500.407339-1530-210959792561279/AnsiballZ_stat.py'
Sep 30 06:58:20 compute-0 sudo[146439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:21 compute-0 python3.9[146441]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:21 compute-0 sudo[146439]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:21 compute-0 sudo[146562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbwoslldvjlumlcgnfnqcjimjxafwscg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215500.407339-1530-210959792561279/AnsiballZ_copy.py'
Sep 30 06:58:21 compute-0 sudo[146562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:21 compute-0 python3.9[146564]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215500.407339-1530-210959792561279/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:21 compute-0 sudo[146562]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:22 compute-0 python3.9[146714]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:58:23 compute-0 sudo[146884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qtilpqnonxlvhubxtahwieyzswegadxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215502.9027812-1942-98214647529183/AnsiballZ_seboolean.py'
Sep 30 06:58:23 compute-0 sudo[146884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:23 compute-0 podman[146841]: 2025-09-30 06:58:23.491323884 +0000 UTC m=+0.085767079 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 06:58:23 compute-0 python3.9[146888]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Sep 30 06:58:24 compute-0 sudo[146884]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:25 compute-0 sudo[147043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xznsaxmzwaleytlspunnbjbprvtqdbkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215505.0969918-1958-54126083143376/AnsiballZ_copy.py'
Sep 30 06:58:25 compute-0 dbus-broker-launch[814]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Sep 30 06:58:25 compute-0 sudo[147043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:25 compute-0 python3.9[147045]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:25 compute-0 sudo[147043]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:26 compute-0 sudo[147195]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgjlqemyofbakjzthcbctaosqktqrhbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215505.9383543-1958-60191322590382/AnsiballZ_copy.py'
Sep 30 06:58:26 compute-0 sudo[147195]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:26 compute-0 python3.9[147197]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:26 compute-0 sudo[147195]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:27 compute-0 sudo[147347]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbrrlkfbpekxfglbcicpejmaqoqrvsnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215506.6767607-1958-173416033985140/AnsiballZ_copy.py'
Sep 30 06:58:27 compute-0 sudo[147347]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:27 compute-0 python3.9[147349]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:27 compute-0 sudo[147347]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:27 compute-0 sudo[147499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-salhuqxadmljgxpabnjrpqrsefyrxpgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215507.4164531-1958-121000818109844/AnsiballZ_copy.py'
Sep 30 06:58:27 compute-0 sudo[147499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:28 compute-0 python3.9[147501]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:28 compute-0 sudo[147499]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:28 compute-0 sudo[147651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abqhnkftsulbeakqjpuaxfuhiadcwhfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215508.2324724-1958-154430868731539/AnsiballZ_copy.py'
Sep 30 06:58:28 compute-0 sudo[147651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:28 compute-0 python3.9[147653]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:28 compute-0 sudo[147651]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:29 compute-0 sudo[147803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eredfmydjqhaludacdrdmwkompiagmkf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215509.1288197-2030-25659134357189/AnsiballZ_copy.py'
Sep 30 06:58:29 compute-0 sudo[147803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:29 compute-0 python3.9[147805]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:29 compute-0 sudo[147803]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:30 compute-0 sudo[147955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ykkjjcqxynpayagpfgoboqxpcykkhkew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215509.949429-2030-216700194126667/AnsiballZ_copy.py'
Sep 30 06:58:30 compute-0 sudo[147955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:30 compute-0 python3.9[147957]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:30 compute-0 sudo[147955]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:31 compute-0 sudo[148107]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yomfgtcwnqveuvdgpikpabptcffygjpt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215510.792055-2030-41823272334078/AnsiballZ_copy.py'
Sep 30 06:58:31 compute-0 sudo[148107]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:31 compute-0 python3.9[148109]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:31 compute-0 sudo[148107]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:32 compute-0 sudo[148259]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqfboypifgslckdpgqlupnxfxgpexxhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215511.7140694-2030-232864777809739/AnsiballZ_copy.py'
Sep 30 06:58:32 compute-0 sudo[148259]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:32 compute-0 python3.9[148261]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:32 compute-0 sudo[148259]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:32 compute-0 sudo[148411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfzumbqrkbvcsxxvvdprbkfpwvjhtfcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215512.480624-2030-28778398871822/AnsiballZ_copy.py'
Sep 30 06:58:32 compute-0 sudo[148411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:33 compute-0 python3.9[148413]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:33 compute-0 sudo[148411]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:33 compute-0 sudo[148563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gditqauwsksxtkzvbkiovlrbjbccwdoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215513.3738787-2102-116739343097998/AnsiballZ_systemd.py'
Sep 30 06:58:33 compute-0 sudo[148563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:34 compute-0 python3.9[148565]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 06:58:34 compute-0 systemd[1]: Reloading.
Sep 30 06:58:34 compute-0 systemd-rc-local-generator[148594]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:58:34 compute-0 systemd-sysv-generator[148598]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:58:34 compute-0 systemd[1]: Starting libvirt logging daemon socket...
Sep 30 06:58:34 compute-0 systemd[1]: Listening on libvirt logging daemon socket.
Sep 30 06:58:34 compute-0 systemd[1]: Starting libvirt logging daemon admin socket...
Sep 30 06:58:34 compute-0 systemd[1]: Listening on libvirt logging daemon admin socket.
Sep 30 06:58:34 compute-0 systemd[1]: Starting libvirt logging daemon...
Sep 30 06:58:34 compute-0 systemd[1]: Started libvirt logging daemon.
Sep 30 06:58:34 compute-0 sudo[148563]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:35 compute-0 sudo[148757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdyzykqdtnxapyvfvedjsvjuwvjdckxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215514.703818-2102-143168641589107/AnsiballZ_systemd.py'
Sep 30 06:58:35 compute-0 sudo[148757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:35 compute-0 python3.9[148759]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 06:58:35 compute-0 systemd[1]: Reloading.
Sep 30 06:58:35 compute-0 systemd-rc-local-generator[148782]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:58:35 compute-0 systemd-sysv-generator[148786]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:58:35 compute-0 systemd[1]: Starting libvirt nodedev daemon socket...
Sep 30 06:58:35 compute-0 systemd[1]: Listening on libvirt nodedev daemon socket.
Sep 30 06:58:35 compute-0 systemd[1]: Starting libvirt nodedev daemon admin socket...
Sep 30 06:58:35 compute-0 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Sep 30 06:58:35 compute-0 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Sep 30 06:58:35 compute-0 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Sep 30 06:58:35 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Sep 30 06:58:35 compute-0 systemd[1]: Started libvirt nodedev daemon.
Sep 30 06:58:35 compute-0 sudo[148757]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:36 compute-0 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Sep 30 06:58:36 compute-0 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Sep 30 06:58:36 compute-0 sudo[148972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bexcyubbwudulouchbaajhsqvxnyhcvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215516.130843-2102-210760404289413/AnsiballZ_systemd.py'
Sep 30 06:58:36 compute-0 sudo[148972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:36 compute-0 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Sep 30 06:58:36 compute-0 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Sep 30 06:58:36 compute-0 python3.9[148974]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 06:58:36 compute-0 systemd[1]: Reloading.
Sep 30 06:58:36 compute-0 systemd-sysv-generator[149011]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:58:36 compute-0 systemd-rc-local-generator[149008]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:58:37 compute-0 systemd[1]: Starting libvirt proxy daemon admin socket...
Sep 30 06:58:37 compute-0 systemd[1]: Starting libvirt proxy daemon read-only socket...
Sep 30 06:58:37 compute-0 systemd[1]: Listening on libvirt proxy daemon admin socket.
Sep 30 06:58:37 compute-0 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Sep 30 06:58:37 compute-0 systemd[1]: Starting libvirt proxy daemon...
Sep 30 06:58:37 compute-0 systemd[1]: Started libvirt proxy daemon.
Sep 30 06:58:37 compute-0 sudo[148972]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:37 compute-0 setroubleshoot[148846]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 2c98efa2-123b-498f-a83e-cb4681173aa3
Sep 30 06:58:37 compute-0 setroubleshoot[148846]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Sep 30 06:58:37 compute-0 sudo[149191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frwwgrfpavkspkrewnmyqghkpktadokw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215517.3968744-2102-263830556744905/AnsiballZ_systemd.py'
Sep 30 06:58:37 compute-0 sudo[149191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:38 compute-0 python3.9[149193]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 06:58:38 compute-0 systemd[1]: Reloading.
Sep 30 06:58:38 compute-0 systemd-rc-local-generator[149219]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:58:38 compute-0 systemd-sysv-generator[149224]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:58:38 compute-0 systemd[1]: Listening on libvirt locking daemon socket.
Sep 30 06:58:38 compute-0 systemd[1]: Starting libvirt QEMU daemon socket...
Sep 30 06:58:38 compute-0 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 30 06:58:38 compute-0 systemd[1]: Starting Virtual Machine and Container Registration Service...
Sep 30 06:58:38 compute-0 systemd[1]: Listening on libvirt QEMU daemon socket.
Sep 30 06:58:38 compute-0 systemd[1]: Starting libvirt QEMU daemon admin socket...
Sep 30 06:58:38 compute-0 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Sep 30 06:58:38 compute-0 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Sep 30 06:58:38 compute-0 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Sep 30 06:58:38 compute-0 systemd[1]: Started Virtual Machine and Container Registration Service.
Sep 30 06:58:38 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Sep 30 06:58:38 compute-0 systemd[1]: Started libvirt QEMU daemon.
Sep 30 06:58:38 compute-0 sudo[149191]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:39 compute-0 sudo[149404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnyzavkuurpjwuomehnzfktpaaufzikd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215518.797231-2102-19970861524839/AnsiballZ_systemd.py'
Sep 30 06:58:39 compute-0 sudo[149404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:39 compute-0 python3.9[149406]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 06:58:39 compute-0 systemd[1]: Reloading.
Sep 30 06:58:39 compute-0 systemd-rc-local-generator[149434]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:58:39 compute-0 systemd-sysv-generator[149438]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:58:39 compute-0 systemd[1]: Starting libvirt secret daemon socket...
Sep 30 06:58:39 compute-0 systemd[1]: Listening on libvirt secret daemon socket.
Sep 30 06:58:39 compute-0 systemd[1]: Starting libvirt secret daemon admin socket...
Sep 30 06:58:39 compute-0 systemd[1]: Starting libvirt secret daemon read-only socket...
Sep 30 06:58:39 compute-0 systemd[1]: Listening on libvirt secret daemon admin socket.
Sep 30 06:58:39 compute-0 systemd[1]: Listening on libvirt secret daemon read-only socket.
Sep 30 06:58:39 compute-0 systemd[1]: Starting libvirt secret daemon...
Sep 30 06:58:39 compute-0 systemd[1]: Started libvirt secret daemon.
Sep 30 06:58:40 compute-0 sudo[149404]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:40 compute-0 sudo[149614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hummtfofrclmzwgrifpxipkkyvcfdpff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215520.3290918-2176-184367053736440/AnsiballZ_file.py'
Sep 30 06:58:40 compute-0 sudo[149614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:40 compute-0 python3.9[149616]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:40 compute-0 sudo[149614]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:41 compute-0 sudo[149766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbdlcmjwlyodukgfrpclpprvmnjycece ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215521.1807656-2192-151037537600254/AnsiballZ_find.py'
Sep 30 06:58:41 compute-0 sudo[149766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:41 compute-0 python3.9[149768]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 06:58:41 compute-0 sudo[149766]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:42 compute-0 sudo[149918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wihvnpdwfijzcmpmytenavpojpjywbiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215522.394102-2220-205129776255660/AnsiballZ_stat.py'
Sep 30 06:58:42 compute-0 sudo[149918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:43 compute-0 python3.9[149920]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:43 compute-0 sudo[149918]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:43 compute-0 sudo[150041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsmguxrmyduyfwzxtfuygzyfnjcjgzwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215522.394102-2220-205129776255660/AnsiballZ_copy.py'
Sep 30 06:58:43 compute-0 sudo[150041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:43 compute-0 python3.9[150043]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215522.394102-2220-205129776255660/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:43 compute-0 sudo[150041]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:44 compute-0 sudo[150193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnkyxzlisdagkyhiqhmlsofplxsvsvqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215524.116669-2252-104906300941860/AnsiballZ_file.py'
Sep 30 06:58:44 compute-0 sudo[150193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:44 compute-0 python3.9[150195]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:44 compute-0 sudo[150193]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:45 compute-0 sudo[150345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-advfnfsbtvgcqcnkmfcfilwjhtgdmuox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215525.0439413-2268-228198238021334/AnsiballZ_stat.py'
Sep 30 06:58:45 compute-0 sudo[150345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:45 compute-0 python3.9[150347]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:45 compute-0 sudo[150345]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:46 compute-0 sudo[150423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrotivdalrhwzegvtjcgfqhcivzqlbwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215525.0439413-2268-228198238021334/AnsiballZ_file.py'
Sep 30 06:58:46 compute-0 sudo[150423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:46 compute-0 python3.9[150425]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:46 compute-0 sudo[150423]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:46 compute-0 sudo[150575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmmmlwvvpsrbpypnpzepecfhumecgovg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215526.5733528-2292-65066642482179/AnsiballZ_stat.py'
Sep 30 06:58:46 compute-0 sudo[150575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:47 compute-0 python3.9[150577]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:47 compute-0 sudo[150575]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:47 compute-0 sudo[150653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahlbqpkkyiwdsypihkjtsditbtkiktkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215526.5733528-2292-65066642482179/AnsiballZ_file.py'
Sep 30 06:58:47 compute-0 sudo[150653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:47 compute-0 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Sep 30 06:58:47 compute-0 python3.9[150655]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.4tvkls4y recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:47 compute-0 sudo[150653]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:47 compute-0 systemd[1]: setroubleshootd.service: Deactivated successfully.
Sep 30 06:58:48 compute-0 sudo[150805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezozeryvjvuzjusrdxpadnvbukhdtmzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215527.9250672-2316-50715666610818/AnsiballZ_stat.py'
Sep 30 06:58:48 compute-0 sudo[150805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:48 compute-0 python3.9[150807]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:48 compute-0 sudo[150805]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:48 compute-0 sudo[150883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjsaahibqbwritjirsbppxezgjjsdojk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215527.9250672-2316-50715666610818/AnsiballZ_file.py'
Sep 30 06:58:48 compute-0 sudo[150883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:49 compute-0 python3.9[150885]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:49 compute-0 sudo[150883]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:49 compute-0 sudo[151035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlaqtfhexymjxqcmgypgguhfoqszzxdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215529.5056622-2342-131538747400708/AnsiballZ_command.py'
Sep 30 06:58:49 compute-0 sudo[151035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:50 compute-0 python3.9[151037]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:58:50 compute-0 sudo[151035]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:50 compute-0 podman[151086]: 2025-09-30 06:58:50.534594359 +0000 UTC m=+0.115574145 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 06:58:50 compute-0 sudo[151214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjgiepnvxgwbhvysjkqvwjdoaslwfhme ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759215530.3895822-2358-83723856862192/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 06:58:50 compute-0 sudo[151214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:51 compute-0 python3[151216]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 06:58:51 compute-0 sudo[151214]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:51 compute-0 sudo[151366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wlmginjrrxthpekknxdjgogqrwyfoehu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215531.5015414-2374-48021699102400/AnsiballZ_stat.py'
Sep 30 06:58:51 compute-0 sudo[151366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:52 compute-0 python3.9[151368]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:52 compute-0 sudo[151366]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:52 compute-0 sudo[151444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkjwabbmvqddspzemyoasjvecngyiwpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215531.5015414-2374-48021699102400/AnsiballZ_file.py'
Sep 30 06:58:52 compute-0 sudo[151444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:52 compute-0 python3.9[151446]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:52 compute-0 sudo[151444]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:53 compute-0 sudo[151596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tlrrdbmcfjjbhbxrpmfxieuuydooxvjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215533.033482-2398-181977925628061/AnsiballZ_stat.py'
Sep 30 06:58:53 compute-0 sudo[151596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:53 compute-0 python3.9[151598]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:53 compute-0 sudo[151596]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:53 compute-0 podman[151648]: 2025-09-30 06:58:53.986200041 +0000 UTC m=+0.050034702 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent)
Sep 30 06:58:53 compute-0 sudo[151693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftcsjrpbrkiserrudmalvvmvzvjxndde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215533.033482-2398-181977925628061/AnsiballZ_file.py'
Sep 30 06:58:54 compute-0 sudo[151693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:54 compute-0 python3.9[151695]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:54 compute-0 sudo[151693]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:54 compute-0 sudo[151845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijakqjbgczgbumuefgkubkwucfyxlbrv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215534.4605596-2422-256486406061411/AnsiballZ_stat.py'
Sep 30 06:58:54 compute-0 sudo[151845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:55 compute-0 python3.9[151847]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:55 compute-0 sudo[151845]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:55 compute-0 sudo[151923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xurwxvwneykvkqntqimjkmrimddsvzly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215534.4605596-2422-256486406061411/AnsiballZ_file.py'
Sep 30 06:58:55 compute-0 sudo[151923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:55 compute-0 python3.9[151926]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:55 compute-0 sudo[151923]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:56 compute-0 sudo[152077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkrpdionapvxggoyjyxtlnxuarfrnqod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215535.8604908-2446-206299879021990/AnsiballZ_stat.py'
Sep 30 06:58:56 compute-0 sudo[152077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:56 compute-0 python3.9[152079]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:56 compute-0 sudo[152077]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:56 compute-0 sudo[152155]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-junvyoehhoorrktfahlqhslqexmedunb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215535.8604908-2446-206299879021990/AnsiballZ_file.py'
Sep 30 06:58:56 compute-0 sudo[152155]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:56 compute-0 sshd-session[151924]: Invalid user postgres from 80.94.95.116 port 31838
Sep 30 06:58:57 compute-0 sshd-session[151924]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 06:58:57 compute-0 sshd-session[151924]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.116
Sep 30 06:58:57 compute-0 python3.9[152157]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:57 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 06:58:57 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 06:58:57 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 06:58:57 compute-0 sudo[152155]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:57 compute-0 sudo[152308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecokhfqwnelcnptykepoyvgrsajwoewv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215537.282506-2470-20425680808981/AnsiballZ_stat.py'
Sep 30 06:58:57 compute-0 sudo[152308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:57 compute-0 python3.9[152310]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:58:58 compute-0 sudo[152308]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:58 compute-0 sudo[152433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lonevcjxjdpdrmxzaszyomrjfwyhvhmo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215537.282506-2470-20425680808981/AnsiballZ_copy.py'
Sep 30 06:58:58 compute-0 sudo[152433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:58 compute-0 python3.9[152435]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215537.282506-2470-20425680808981/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:58 compute-0 sudo[152433]: pam_unix(sudo:session): session closed for user root
Sep 30 06:58:58 compute-0 sshd-session[151924]: Failed password for invalid user postgres from 80.94.95.116 port 31838 ssh2
Sep 30 06:58:59 compute-0 sudo[152585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dltdzbbhybkpnwfmbrfflnauvjtrhgni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215538.962184-2500-32003252174485/AnsiballZ_file.py'
Sep 30 06:58:59 compute-0 sudo[152585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:58:59 compute-0 python3.9[152587]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:58:59 compute-0 sudo[152585]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:00 compute-0 sudo[152737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yyqduinkcsyhjnaikuzyzfgywlaihogw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215539.8023264-2516-172687258874530/AnsiballZ_command.py'
Sep 30 06:59:00 compute-0 sudo[152737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:00 compute-0 python3.9[152739]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:59:00 compute-0 sudo[152737]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:00 compute-0 sshd-session[151924]: Connection closed by invalid user postgres 80.94.95.116 port 31838 [preauth]
Sep 30 06:59:01 compute-0 sudo[152892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgtyhlvwnxjqvuccwochzxobjwkskjdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215540.6285522-2532-117547541539687/AnsiballZ_blockinfile.py'
Sep 30 06:59:01 compute-0 sudo[152892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:01 compute-0 python3.9[152894]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:59:01 compute-0 sudo[152892]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:02 compute-0 sudo[153044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afiiiyrerymqpvilcwwafzprdchwhodk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215541.701535-2550-257490084760956/AnsiballZ_command.py'
Sep 30 06:59:02 compute-0 sudo[153044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:02 compute-0 python3.9[153046]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:59:02 compute-0 sudo[153044]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:02 compute-0 sudo[153197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wikxtaukuaavjgcvvroubnqbigsojjqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215542.603011-2566-76391917795431/AnsiballZ_stat.py'
Sep 30 06:59:02 compute-0 sudo[153197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:03 compute-0 python3.9[153199]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:59:03 compute-0 sudo[153197]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:03 compute-0 sudo[153351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csdkjsggpaqvxbhbqhieqgdzxuwosuke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215543.43658-2582-218853412321252/AnsiballZ_command.py'
Sep 30 06:59:03 compute-0 sudo[153351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:03 compute-0 python3.9[153353]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 06:59:04 compute-0 sudo[153351]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:04 compute-0 sudo[153506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvuxrskxkjxzwbwjtrvfazcfxkqjgpwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215544.1946251-2598-146303669743709/AnsiballZ_file.py'
Sep 30 06:59:04 compute-0 sudo[153506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:04 compute-0 python3.9[153508]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:59:04 compute-0 sudo[153506]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:05 compute-0 sudo[153658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzjbufnktvhumctqwuqzbhdueayklmox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215545.0420277-2614-271652720538244/AnsiballZ_stat.py'
Sep 30 06:59:05 compute-0 sudo[153658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:05 compute-0 python3.9[153660]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:59:05 compute-0 sudo[153658]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:06 compute-0 sudo[153781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dfxkxqixjybhsumkqgturfbfnotnuivd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215545.0420277-2614-271652720538244/AnsiballZ_copy.py'
Sep 30 06:59:06 compute-0 sudo[153781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:06 compute-0 python3.9[153783]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215545.0420277-2614-271652720538244/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:59:06 compute-0 sudo[153781]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:06 compute-0 sudo[153933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfsvmfigvgojoqdyczukppcaapewcqrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215546.5361834-2644-128033466361758/AnsiballZ_stat.py'
Sep 30 06:59:06 compute-0 sudo[153933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:07 compute-0 python3.9[153935]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:59:07 compute-0 sudo[153933]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:07 compute-0 sudo[154056]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikhqrrpcrkpowkizjmrrlsceflgoetng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215546.5361834-2644-128033466361758/AnsiballZ_copy.py'
Sep 30 06:59:07 compute-0 sudo[154056]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:07 compute-0 python3.9[154058]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215546.5361834-2644-128033466361758/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:59:07 compute-0 sudo[154056]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:08 compute-0 sudo[154208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omveerkcnhkvvulphxpcaxrqhpchlxpg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215547.8957326-2674-127037302128285/AnsiballZ_stat.py'
Sep 30 06:59:08 compute-0 sudo[154208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:08 compute-0 python3.9[154210]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:59:08 compute-0 sudo[154208]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:08 compute-0 sudo[154331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twsasatwvrvlqivnyestwtpueyfuhglu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215547.8957326-2674-127037302128285/AnsiballZ_copy.py'
Sep 30 06:59:08 compute-0 sudo[154331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:09 compute-0 python3.9[154333]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215547.8957326-2674-127037302128285/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:59:09 compute-0 sudo[154331]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:09 compute-0 sudo[154483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxidmwpbvtazogmkjdencdbadulkjjeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215549.4342797-2704-162389845713559/AnsiballZ_systemd.py'
Sep 30 06:59:09 compute-0 sudo[154483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:10 compute-0 python3.9[154485]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:59:10 compute-0 systemd[1]: Reloading.
Sep 30 06:59:10 compute-0 systemd-rc-local-generator[154509]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:59:10 compute-0 systemd-sysv-generator[154514]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:59:10 compute-0 systemd[1]: Reached target edpm_libvirt.target.
Sep 30 06:59:10 compute-0 sudo[154483]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:10 compute-0 sudo[154673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ffybycstmqgasosqmqcclsettucbtjtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215550.6626835-2720-177416348714082/AnsiballZ_systemd.py'
Sep 30 06:59:10 compute-0 sudo[154673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:11 compute-0 python3.9[154675]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Sep 30 06:59:11 compute-0 systemd[1]: Reloading.
Sep 30 06:59:11 compute-0 systemd-sysv-generator[154705]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:59:11 compute-0 systemd-rc-local-generator[154702]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:59:11 compute-0 systemd[1]: Reloading.
Sep 30 06:59:11 compute-0 systemd-sysv-generator[154737]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:59:11 compute-0 systemd-rc-local-generator[154734]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:59:11 compute-0 sudo[154673]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:12 compute-0 sshd-session[100448]: Connection closed by 192.168.122.30 port 46414
Sep 30 06:59:12 compute-0 sshd-session[100445]: pam_unix(sshd:session): session closed for user zuul
Sep 30 06:59:12 compute-0 systemd-logind[824]: Session 24 logged out. Waiting for processes to exit.
Sep 30 06:59:12 compute-0 systemd[1]: session-24.scope: Deactivated successfully.
Sep 30 06:59:12 compute-0 systemd[1]: session-24.scope: Consumed 3min 58.324s CPU time.
Sep 30 06:59:12 compute-0 systemd-logind[824]: Removed session 24.
Sep 30 06:59:18 compute-0 sshd-session[154770]: Accepted publickey for zuul from 192.168.122.30 port 53024 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 06:59:18 compute-0 systemd-logind[824]: New session 25 of user zuul.
Sep 30 06:59:18 compute-0 systemd[1]: Started Session 25 of User zuul.
Sep 30 06:59:18 compute-0 sshd-session[154770]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 06:59:19 compute-0 python3.9[154923]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 06:59:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:59:20.498 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 06:59:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:59:20.500 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 06:59:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 06:59:20.500 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 06:59:20 compute-0 sudo[155096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxayiozlyqkxgxrxrbqnjhokxofvvavk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215560.1587203-48-159865526367036/AnsiballZ_file.py'
Sep 30 06:59:20 compute-0 sudo[155096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:20 compute-0 podman[155052]: 2025-09-30 06:59:20.723459527 +0000 UTC m=+0.104186617 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 06:59:20 compute-0 python3.9[155106]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:59:20 compute-0 sudo[155096]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:21 compute-0 sudo[155256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xefpknwsclrsoiejpcnnnxtztkekntyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215561.1255033-48-24458842178230/AnsiballZ_file.py'
Sep 30 06:59:21 compute-0 sudo[155256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:21 compute-0 python3.9[155258]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:59:21 compute-0 sudo[155256]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:22 compute-0 sudo[155408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxzqqvjgscprbajugtwhkvlmcrytoiub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215561.8915277-48-225331215216995/AnsiballZ_file.py'
Sep 30 06:59:22 compute-0 sudo[155408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:22 compute-0 python3.9[155410]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:59:22 compute-0 sudo[155408]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:22 compute-0 sudo[155560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odmunukotsbrpqkkbofujefqmokwwtbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215562.570781-48-49527150517985/AnsiballZ_file.py'
Sep 30 06:59:22 compute-0 sudo[155560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:23 compute-0 python3.9[155562]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 06:59:23 compute-0 sudo[155560]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:23 compute-0 sudo[155712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnjcdqqtiefuhxxfzqrpsptyrdkakphi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215563.4041965-48-28578838158348/AnsiballZ_file.py'
Sep 30 06:59:23 compute-0 sudo[155712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:23 compute-0 python3.9[155714]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:59:23 compute-0 sudo[155712]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:24 compute-0 podman[155798]: 2025-09-30 06:59:24.53244318 +0000 UTC m=+0.104171556 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 06:59:24 compute-0 sudo[155883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxyjnlmyepqarjnggacglthcpirtkcvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215564.1752589-120-21326280293144/AnsiballZ_stat.py'
Sep 30 06:59:24 compute-0 sudo[155883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:24 compute-0 python3.9[155885]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:59:24 compute-0 sudo[155883]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:25 compute-0 sudo[156037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqirkzttxbovgqlzpdruqldsocmvghiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215565.072275-136-221636077815263/AnsiballZ_systemd.py'
Sep 30 06:59:25 compute-0 sudo[156037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:26 compute-0 python3.9[156039]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:59:26 compute-0 systemd[1]: Reloading.
Sep 30 06:59:26 compute-0 systemd-rc-local-generator[156069]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:59:26 compute-0 systemd-sysv-generator[156074]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:59:26 compute-0 sudo[156037]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:27 compute-0 sudo[156227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ievpkdhumqmqjaktjsgidlrcrswonkyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215566.73151-152-30102688901746/AnsiballZ_service_facts.py'
Sep 30 06:59:27 compute-0 sudo[156227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:27 compute-0 python3.9[156229]: ansible-ansible.builtin.service_facts Invoked
Sep 30 06:59:27 compute-0 network[156246]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 06:59:27 compute-0 network[156247]: 'network-scripts' will be removed from distribution in near future.
Sep 30 06:59:27 compute-0 network[156248]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 06:59:32 compute-0 sudo[156227]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:33 compute-0 sudo[156519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyqwfyriynhqzwvcqtlrwriktgrpfxgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215573.2613192-168-280826422309220/AnsiballZ_systemd.py'
Sep 30 06:59:33 compute-0 sudo[156519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:34 compute-0 python3.9[156521]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:59:35 compute-0 systemd[1]: Reloading.
Sep 30 06:59:35 compute-0 systemd-rc-local-generator[156543]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:59:35 compute-0 systemd-sysv-generator[156549]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:59:35 compute-0 sudo[156519]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:36 compute-0 python3.9[156708]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:59:37 compute-0 sudo[156858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqobvixnpylqapkhvyenrrrttvwdwgbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215576.5892966-202-270401499426967/AnsiballZ_podman_container.py'
Sep 30 06:59:37 compute-0 sudo[156858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:37 compute-0 python3.9[156860]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None 
pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Sep 30 06:59:37 compute-0 podman[156896]: 2025-09-30 06:59:37.659850778 +0000 UTC m=+0.060103264 container create d5559e8fd8302a1413e5a50f92ea2eaa67468c50880e86c674e449f65c71078b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 06:59:37 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 06:59:37 compute-0 NetworkManager[51813]: <info>  [1759215577.6952] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/20)
Sep 30 06:59:37 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Sep 30 06:59:37 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Sep 30 06:59:37 compute-0 kernel: veth0: entered allmulticast mode
Sep 30 06:59:37 compute-0 kernel: veth0: entered promiscuous mode
Sep 30 06:59:37 compute-0 NetworkManager[51813]: <info>  [1759215577.7206] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Sep 30 06:59:37 compute-0 kernel: podman0: port 1(veth0) entered blocking state
Sep 30 06:59:37 compute-0 kernel: podman0: port 1(veth0) entered forwarding state
Sep 30 06:59:37 compute-0 NetworkManager[51813]: <info>  [1759215577.7231] device (veth0): carrier: link connected
Sep 30 06:59:37 compute-0 podman[156896]: 2025-09-30 06:59:37.627956901 +0000 UTC m=+0.028209447 image pull 36e09fb90e558c69a5cd1d9e675a0cddae2912ee81c1af712f9b1ec1a4a5791d 38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Sep 30 06:59:37 compute-0 NetworkManager[51813]: <info>  [1759215577.7232] device (podman0): carrier: link connected
Sep 30 06:59:37 compute-0 systemd-udevd[156927]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 06:59:37 compute-0 systemd-udevd[156924]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 06:59:37 compute-0 NetworkManager[51813]: <info>  [1759215577.7771] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 06:59:37 compute-0 NetworkManager[51813]: <info>  [1759215577.7798] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Sep 30 06:59:37 compute-0 NetworkManager[51813]: <info>  [1759215577.7809] device (podman0): Activation: starting connection 'podman0' (e81e2750-8662-444c-bca7-43d24c8368ea)
Sep 30 06:59:37 compute-0 NetworkManager[51813]: <info>  [1759215577.7810] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Sep 30 06:59:37 compute-0 NetworkManager[51813]: <info>  [1759215577.7814] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Sep 30 06:59:37 compute-0 NetworkManager[51813]: <info>  [1759215577.7815] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Sep 30 06:59:37 compute-0 NetworkManager[51813]: <info>  [1759215577.7817] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Sep 30 06:59:37 compute-0 systemd[1]: Starting Network Manager Script Dispatcher Service...
Sep 30 06:59:37 compute-0 systemd[1]: Started Network Manager Script Dispatcher Service.
Sep 30 06:59:37 compute-0 NetworkManager[51813]: <info>  [1759215577.8126] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Sep 30 06:59:37 compute-0 NetworkManager[51813]: <info>  [1759215577.8130] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Sep 30 06:59:37 compute-0 NetworkManager[51813]: <info>  [1759215577.8139] device (podman0): Activation: successful, device activated.
Sep 30 06:59:37 compute-0 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Sep 30 06:59:38 compute-0 systemd[1]: Started libpod-conmon-d5559e8fd8302a1413e5a50f92ea2eaa67468c50880e86c674e449f65c71078b.scope.
Sep 30 06:59:38 compute-0 systemd[1]: Started libcrun container.
Sep 30 06:59:38 compute-0 podman[156896]: 2025-09-30 06:59:38.19392678 +0000 UTC m=+0.594179256 container init d5559e8fd8302a1413e5a50f92ea2eaa67468c50880e86c674e449f65c71078b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Sep 30 06:59:38 compute-0 podman[156896]: 2025-09-30 06:59:38.203110725 +0000 UTC m=+0.603363181 container start d5559e8fd8302a1413e5a50f92ea2eaa67468c50880e86c674e449f65c71078b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 06:59:38 compute-0 podman[156896]: 2025-09-30 06:59:38.206628231 +0000 UTC m=+0.606880897 container attach d5559e8fd8302a1413e5a50f92ea2eaa67468c50880e86c674e449f65c71078b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 06:59:38 compute-0 iscsid_config[157056]: iqn.1994-05.com.redhat:66f2fcc9227
Sep 30 06:59:38 compute-0 systemd[1]: libpod-d5559e8fd8302a1413e5a50f92ea2eaa67468c50880e86c674e449f65c71078b.scope: Deactivated successfully.
Sep 30 06:59:38 compute-0 podman[156896]: 2025-09-30 06:59:38.211438765 +0000 UTC m=+0.611691251 container died d5559e8fd8302a1413e5a50f92ea2eaa67468c50880e86c674e449f65c71078b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 06:59:38 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Sep 30 06:59:38 compute-0 kernel: veth0 (unregistering): left allmulticast mode
Sep 30 06:59:38 compute-0 kernel: veth0 (unregistering): left promiscuous mode
Sep 30 06:59:38 compute-0 kernel: podman0: port 1(veth0) entered disabled state
Sep 30 06:59:38 compute-0 NetworkManager[51813]: <info>  [1759215578.2653] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 06:59:38 compute-0 systemd[1]: run-netns-netns\x2ddaa437b2\x2df53b\x2df7e5\x2dd5a7\x2d7dd3657f0149.mount: Deactivated successfully.
Sep 30 06:59:38 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d5559e8fd8302a1413e5a50f92ea2eaa67468c50880e86c674e449f65c71078b-userdata-shm.mount: Deactivated successfully.
Sep 30 06:59:38 compute-0 systemd[1]: var-lib-containers-storage-overlay-81e5974546b103ff8a436d60623858fc3e2c8914fbecdd0e9f0e93e2e086211d-merged.mount: Deactivated successfully.
Sep 30 06:59:38 compute-0 podman[156896]: 2025-09-30 06:59:38.68865956 +0000 UTC m=+1.088912056 container remove d5559e8fd8302a1413e5a50f92ea2eaa67468c50880e86c674e449f65c71078b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid_config, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Sep 30 06:59:38 compute-0 python3.9[156860]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True 38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest /usr/sbin/iscsi-iname
Sep 30 06:59:38 compute-0 systemd[1]: libpod-conmon-d5559e8fd8302a1413e5a50f92ea2eaa67468c50880e86c674e449f65c71078b.scope: Deactivated successfully.
Sep 30 06:59:38 compute-0 python3.9[156860]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: 
                                             DEPRECATED command:
                                             It is recommended to use Quadlets for running containers and pods under systemd.
                                             
                                             Please refer to podman-systemd.unit(5) for details.
                                             Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Sep 30 06:59:38 compute-0 sudo[156858]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:39 compute-0 sudo[157299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrzndcbquymfwcqpmmyjgonfjyzafaos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215579.1234808-218-105373193157371/AnsiballZ_stat.py'
Sep 30 06:59:39 compute-0 sudo[157299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:39 compute-0 python3.9[157301]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:59:39 compute-0 sudo[157299]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:40 compute-0 sudo[157422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkoilodspsoxmnzayiiukwepegqqnccy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215579.1234808-218-105373193157371/AnsiballZ_copy.py'
Sep 30 06:59:40 compute-0 sudo[157422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:40 compute-0 python3.9[157424]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215579.1234808-218-105373193157371/.source.iscsi _original_basename=.y5_ede_7 follow=False checksum=1a9a61fff8118bbb6893494d22111bd6b61c7838 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:59:40 compute-0 sudo[157422]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:41 compute-0 sudo[157574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zgmkcbfswyydetfrjxjutonymhecqcbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215580.7990074-248-97273329375267/AnsiballZ_file.py'
Sep 30 06:59:41 compute-0 sudo[157574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:41 compute-0 python3.9[157576]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:59:41 compute-0 sudo[157574]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:42 compute-0 python3.9[157726]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 06:59:43 compute-0 sudo[157878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbcwivoixhhmnpnzjoxnjnbaqprpgair ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215582.572029-282-248608228032047/AnsiballZ_lineinfile.py'
Sep 30 06:59:43 compute-0 sudo[157878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:43 compute-0 python3.9[157880]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:59:43 compute-0 sudo[157878]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:44 compute-0 sudo[158030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ipfyvueltvslasuxmcmrlgsewphrauju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215583.7547197-300-27534465970278/AnsiballZ_file.py'
Sep 30 06:59:44 compute-0 sudo[158030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:44 compute-0 python3.9[158032]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:59:44 compute-0 sudo[158030]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:44 compute-0 sudo[158182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fiitipydajnxhnphptmumoakpwmtuldh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215584.5817735-316-32759948913709/AnsiballZ_stat.py'
Sep 30 06:59:44 compute-0 sudo[158182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:45 compute-0 python3.9[158184]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:59:45 compute-0 sudo[158182]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:45 compute-0 sudo[158260]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtcvijccyvgqgypodadwzbqwedgcvhkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215584.5817735-316-32759948913709/AnsiballZ_file.py'
Sep 30 06:59:45 compute-0 sudo[158260]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:45 compute-0 python3.9[158262]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:59:45 compute-0 sudo[158260]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:46 compute-0 sudo[158412]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlkvhqkvgcyhhtcgfpmnitnhvshzrolo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215585.913833-316-139505960325456/AnsiballZ_stat.py'
Sep 30 06:59:46 compute-0 sudo[158412]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:46 compute-0 python3.9[158414]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:59:46 compute-0 sudo[158412]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:46 compute-0 sudo[158490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrcegivffffshaiephahpgotjqhfrchh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215585.913833-316-139505960325456/AnsiballZ_file.py'
Sep 30 06:59:46 compute-0 sudo[158490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:47 compute-0 python3.9[158492]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:59:47 compute-0 sudo[158490]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:47 compute-0 sudo[158642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syuyxczfhgsmhdfdbfggjpsterhyumwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215587.3073144-362-177092406524797/AnsiballZ_file.py'
Sep 30 06:59:47 compute-0 sudo[158642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:47 compute-0 python3.9[158644]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:59:47 compute-0 sudo[158642]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:48 compute-0 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Sep 30 06:59:48 compute-0 sudo[158794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lgxpuciwsgyukpxsklquofdwwptxxtos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215588.1154568-378-174770657965223/AnsiballZ_stat.py'
Sep 30 06:59:48 compute-0 sudo[158794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:48 compute-0 python3.9[158796]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:59:48 compute-0 sudo[158794]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:49 compute-0 sudo[158872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gefvbjcmfzmalpvedgbmmqhygfywcrxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215588.1154568-378-174770657965223/AnsiballZ_file.py'
Sep 30 06:59:49 compute-0 sudo[158872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:49 compute-0 python3.9[158874]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:59:49 compute-0 sudo[158872]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:50 compute-0 sudo[159024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjfzkgbebkeejxelgylygrjqysszjhil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215589.5983918-402-278539103117621/AnsiballZ_stat.py'
Sep 30 06:59:50 compute-0 sudo[159024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:50 compute-0 python3.9[159026]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:59:50 compute-0 sudo[159024]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:50 compute-0 sudo[159102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nerqbypahazfnrrexgrsgkfbgtalidop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215589.5983918-402-278539103117621/AnsiballZ_file.py'
Sep 30 06:59:50 compute-0 sudo[159102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:50 compute-0 python3.9[159104]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:59:50 compute-0 sudo[159102]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:51 compute-0 sudo[159267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qffzezdbrfxjtbkyrrhqnxfyjgmsdeod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215591.0560596-426-142158965715837/AnsiballZ_systemd.py'
Sep 30 06:59:51 compute-0 sudo[159267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:51 compute-0 podman[159228]: 2025-09-30 06:59:51.573949226 +0000 UTC m=+0.137471365 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 06:59:51 compute-0 python3.9[159276]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:59:51 compute-0 systemd[1]: Reloading.
Sep 30 06:59:51 compute-0 systemd-rc-local-generator[159311]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:59:51 compute-0 systemd-sysv-generator[159314]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:59:52 compute-0 sudo[159267]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:52 compute-0 sudo[159470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rnbdjajkeuzourzkvsmwuhdmirlkbhwe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215592.4795465-442-79494840436374/AnsiballZ_stat.py'
Sep 30 06:59:52 compute-0 sudo[159470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:53 compute-0 python3.9[159472]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:59:53 compute-0 sudo[159470]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:53 compute-0 sudo[159548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ocgdqjqugncqhxvcbeshqtfqxrevnyjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215592.4795465-442-79494840436374/AnsiballZ_file.py'
Sep 30 06:59:53 compute-0 sudo[159548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:53 compute-0 python3.9[159550]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:59:53 compute-0 sudo[159548]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:54 compute-0 sudo[159700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ddlqcrlbgpevxxyioumalwyqpmqdangz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215593.947172-466-112233333244673/AnsiballZ_stat.py'
Sep 30 06:59:54 compute-0 sudo[159700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:54 compute-0 python3.9[159702]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:59:54 compute-0 sudo[159700]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:54 compute-0 sudo[159791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qobncwewfcftifbsfonllutxhdkfbmuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215593.947172-466-112233333244673/AnsiballZ_file.py'
Sep 30 06:59:54 compute-0 sudo[159791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:54 compute-0 podman[159752]: 2025-09-30 06:59:54.963966741 +0000 UTC m=+0.089061073 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 06:59:55 compute-0 python3.9[159799]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 06:59:55 compute-0 sudo[159791]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:55 compute-0 sudo[159949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bbrpifdhfhxgsircipiecxkbibzbiisp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215595.4758263-490-199414711837797/AnsiballZ_systemd.py'
Sep 30 06:59:55 compute-0 sudo[159949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:56 compute-0 python3.9[159951]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 06:59:56 compute-0 systemd[1]: Reloading.
Sep 30 06:59:56 compute-0 systemd-sysv-generator[159983]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 06:59:56 compute-0 systemd-rc-local-generator[159979]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 06:59:56 compute-0 systemd[1]: Starting Create netns directory...
Sep 30 06:59:56 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 06:59:56 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 06:59:56 compute-0 systemd[1]: Finished Create netns directory.
Sep 30 06:59:56 compute-0 sudo[159949]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:57 compute-0 sudo[160142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enqsqywffhnxkzewuhaqiqvpyxqrphhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215597.0044582-510-170594045815462/AnsiballZ_file.py'
Sep 30 06:59:57 compute-0 sudo[160142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:57 compute-0 python3.9[160144]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:59:57 compute-0 sudo[160142]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:58 compute-0 sudo[160294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psyeiishichlofgjlnnxbvrnxecetfyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215597.8320532-526-153082062071547/AnsiballZ_stat.py'
Sep 30 06:59:58 compute-0 sudo[160294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:58 compute-0 python3.9[160296]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 06:59:58 compute-0 sudo[160294]: pam_unix(sudo:session): session closed for user root
Sep 30 06:59:58 compute-0 sudo[160417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdfhjtemabyvvwglseywvntzfyypdrvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215597.8320532-526-153082062071547/AnsiballZ_copy.py'
Sep 30 06:59:58 compute-0 sudo[160417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 06:59:59 compute-0 python3.9[160419]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215597.8320532-526-153082062071547/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 06:59:59 compute-0 sudo[160417]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:00 compute-0 sudo[160569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdhhssfcqkulhyasteovexsitvirckrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215599.771409-560-27483769120058/AnsiballZ_file.py'
Sep 30 07:00:00 compute-0 sudo[160569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:00 compute-0 python3.9[160571]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:00:00 compute-0 sudo[160569]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:01 compute-0 sudo[160721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhmhnumbarkoxumfnuflibfvmlahzgqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215600.6660907-576-118879203693124/AnsiballZ_stat.py'
Sep 30 07:00:01 compute-0 sudo[160721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:01 compute-0 python3.9[160723]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:00:01 compute-0 sudo[160721]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:01 compute-0 sudo[160844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsgkifaktwejpbeuqjcfxsepecrlqlvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215600.6660907-576-118879203693124/AnsiballZ_copy.py'
Sep 30 07:00:01 compute-0 sudo[160844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:01 compute-0 python3.9[160846]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215600.6660907-576-118879203693124/.source.json _original_basename=.2_wxo51m follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:01 compute-0 sudo[160844]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:02 compute-0 sudo[160996]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bpxnwmwcwbcioopgussyqlvowtnvsgzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215602.177193-606-82939037228034/AnsiballZ_file.py'
Sep 30 07:00:02 compute-0 sudo[160996]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:02 compute-0 python3.9[160998]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:02 compute-0 sudo[160996]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:03 compute-0 sudo[161148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hatvgcvfobgoifyqkddxqjennicoetyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215603.0983903-622-97965449210010/AnsiballZ_stat.py'
Sep 30 07:00:03 compute-0 sudo[161148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:03 compute-0 sudo[161148]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:04 compute-0 sudo[161271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aefpqgtkdfufcjmixcinomdpoqhkmqca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215603.0983903-622-97965449210010/AnsiballZ_copy.py'
Sep 30 07:00:04 compute-0 sudo[161271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:04 compute-0 sudo[161271]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:05 compute-0 sudo[161423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfbrzgbnqrzvncdwuziokjcwgzamcybg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215605.0575135-656-139785710098388/AnsiballZ_container_config_data.py'
Sep 30 07:00:05 compute-0 sudo[161423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:05 compute-0 python3.9[161425]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Sep 30 07:00:05 compute-0 sudo[161423]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:06 compute-0 sudo[161575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-washkijfefmdclsiqnyvpvixdvbjrogb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215606.1564615-674-10335849059253/AnsiballZ_container_config_hash.py'
Sep 30 07:00:06 compute-0 sudo[161575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:06 compute-0 python3.9[161577]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 07:00:06 compute-0 sudo[161575]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:07 compute-0 sudo[161727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqyqnkgixxkctzeujtmmsvzlpzpyhzjl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215607.3058102-692-242340441884583/AnsiballZ_podman_container_info.py'
Sep 30 07:00:07 compute-0 sudo[161727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:08 compute-0 python3.9[161729]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 07:00:08 compute-0 sudo[161727]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:09 compute-0 sudo[161905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wpkavooqcoxrokejjrikufmrnwmfqymx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759215609.0383208-718-140624566669723/AnsiballZ_edpm_container_manage.py'
Sep 30 07:00:09 compute-0 sudo[161905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:09 compute-0 python3[161907]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 07:00:10 compute-0 podman[161945]: 2025-09-30 07:00:10.272420797 +0000 UTC m=+0.074411853 container create d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.build-date=20250930)
Sep 30 07:00:10 compute-0 podman[161945]: 2025-09-30 07:00:10.235477429 +0000 UTC m=+0.037468525 image pull 36e09fb90e558c69a5cd1d9e675a0cddae2912ee81c1af712f9b1ec1a4a5791d 38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Sep 30 07:00:10 compute-0 python3[161907]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z 38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest
Sep 30 07:00:10 compute-0 sudo[161905]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:11 compute-0 sudo[162133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfjkrboibfgxemjwkxrflvxintmchlxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215610.738544-734-7983939041935/AnsiballZ_stat.py'
Sep 30 07:00:11 compute-0 sudo[162133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:11 compute-0 python3.9[162135]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:00:11 compute-0 sudo[162133]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:12 compute-0 sudo[162287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zikschhhyfduakgbtqhrcurdvilhqnzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215611.7496645-752-200398340172354/AnsiballZ_file.py'
Sep 30 07:00:12 compute-0 sudo[162287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:12 compute-0 python3.9[162289]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:12 compute-0 sudo[162287]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:12 compute-0 sudo[162363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctldnassnpbahjadmyogueonoglthljv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215611.7496645-752-200398340172354/AnsiballZ_stat.py'
Sep 30 07:00:12 compute-0 sudo[162363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:12 compute-0 python3.9[162365]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:00:12 compute-0 sudo[162363]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:13 compute-0 sudo[162514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilifzeblzvcwinnertwvbqikvcvyfddv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215613.0113387-752-92441758749223/AnsiballZ_copy.py'
Sep 30 07:00:13 compute-0 sudo[162514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:13 compute-0 python3.9[162516]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759215613.0113387-752-92441758749223/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:13 compute-0 sudo[162514]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:14 compute-0 sudo[162592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qgaubhdqephmaaevzpckbpnpbnvkjuag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215613.0113387-752-92441758749223/AnsiballZ_systemd.py'
Sep 30 07:00:14 compute-0 sudo[162592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:14 compute-0 unix_chkpwd[162595]: password check failed for user (root)
Sep 30 07:00:14 compute-0 sshd-session[162517]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Sep 30 07:00:14 compute-0 python3.9[162594]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 07:00:14 compute-0 systemd[1]: Reloading.
Sep 30 07:00:14 compute-0 systemd-rc-local-generator[162621]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:00:14 compute-0 systemd-sysv-generator[162626]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:00:14 compute-0 sudo[162592]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:15 compute-0 sudo[162703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjeayzowsbzklpbzwffgvwcosiejyaqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215613.0113387-752-92441758749223/AnsiballZ_systemd.py'
Sep 30 07:00:15 compute-0 sudo[162703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:15 compute-0 python3.9[162705]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:00:15 compute-0 systemd[1]: Reloading.
Sep 30 07:00:15 compute-0 systemd-rc-local-generator[162735]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:00:15 compute-0 systemd-sysv-generator[162740]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:00:16 compute-0 systemd[1]: Starting iscsid container...
Sep 30 07:00:16 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:00:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c660a60b0d484463b7837606adcd84b1244c33383842ea494c0e00f3e56a3d23/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 07:00:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c660a60b0d484463b7837606adcd84b1244c33383842ea494c0e00f3e56a3d23/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Sep 30 07:00:16 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c660a60b0d484463b7837606adcd84b1244c33383842ea494c0e00f3e56a3d23/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 07:00:16 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b.
Sep 30 07:00:16 compute-0 podman[162746]: 2025-09-30 07:00:16.22956006 +0000 UTC m=+0.159682041 container init d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 07:00:16 compute-0 iscsid[162762]: + sudo -E kolla_set_configs
Sep 30 07:00:16 compute-0 podman[162746]: 2025-09-30 07:00:16.268632322 +0000 UTC m=+0.198754363 container start d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 07:00:16 compute-0 podman[162746]: iscsid
Sep 30 07:00:16 compute-0 sudo[162768]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 07:00:16 compute-0 systemd[1]: Started iscsid container.
Sep 30 07:00:16 compute-0 systemd[1]: Created slice User Slice of UID 0.
Sep 30 07:00:16 compute-0 systemd[1]: Starting User Runtime Directory /run/user/0...
Sep 30 07:00:16 compute-0 sudo[162703]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:16 compute-0 systemd[1]: Finished User Runtime Directory /run/user/0.
Sep 30 07:00:16 compute-0 sshd-session[162517]: Failed password for root from 193.46.255.244 port 16192 ssh2
Sep 30 07:00:16 compute-0 systemd[1]: Starting User Manager for UID 0...
Sep 30 07:00:16 compute-0 systemd[162784]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Sep 30 07:00:16 compute-0 podman[162769]: 2025-09-30 07:00:16.390634022 +0000 UTC m=+0.098393503 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4)
Sep 30 07:00:16 compute-0 systemd[1]: d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b-59e873b746d500e0.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 07:00:16 compute-0 systemd[1]: d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b-59e873b746d500e0.service: Failed with result 'exit-code'.
Sep 30 07:00:16 compute-0 systemd[162784]: Queued start job for default target Main User Target.
Sep 30 07:00:16 compute-0 systemd[162784]: Created slice User Application Slice.
Sep 30 07:00:16 compute-0 systemd[162784]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Sep 30 07:00:16 compute-0 systemd[162784]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 07:00:16 compute-0 systemd[162784]: Reached target Paths.
Sep 30 07:00:16 compute-0 systemd[162784]: Reached target Timers.
Sep 30 07:00:16 compute-0 systemd[162784]: Starting D-Bus User Message Bus Socket...
Sep 30 07:00:16 compute-0 systemd[162784]: Starting Create User's Volatile Files and Directories...
Sep 30 07:00:16 compute-0 systemd[162784]: Listening on D-Bus User Message Bus Socket.
Sep 30 07:00:16 compute-0 systemd[162784]: Reached target Sockets.
Sep 30 07:00:16 compute-0 systemd[162784]: Finished Create User's Volatile Files and Directories.
Sep 30 07:00:16 compute-0 systemd[162784]: Reached target Basic System.
Sep 30 07:00:16 compute-0 systemd[162784]: Reached target Main User Target.
Sep 30 07:00:16 compute-0 systemd[162784]: Startup finished in 171ms.
Sep 30 07:00:16 compute-0 systemd[1]: Started User Manager for UID 0.
Sep 30 07:00:16 compute-0 systemd[1]: Started Session c3 of User root.
Sep 30 07:00:16 compute-0 sudo[162768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 07:00:16 compute-0 iscsid[162762]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 07:00:16 compute-0 iscsid[162762]: INFO:__main__:Validating config file
Sep 30 07:00:16 compute-0 iscsid[162762]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 07:00:16 compute-0 iscsid[162762]: INFO:__main__:Writing out command to execute
Sep 30 07:00:16 compute-0 sudo[162768]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:16 compute-0 systemd[1]: session-c3.scope: Deactivated successfully.
Sep 30 07:00:16 compute-0 iscsid[162762]: ++ cat /run_command
Sep 30 07:00:16 compute-0 iscsid[162762]: + CMD='/usr/sbin/iscsid -f'
Sep 30 07:00:16 compute-0 iscsid[162762]: + ARGS=
Sep 30 07:00:16 compute-0 iscsid[162762]: + sudo kolla_copy_cacerts
Sep 30 07:00:16 compute-0 unix_chkpwd[162886]: password check failed for user (root)
Sep 30 07:00:16 compute-0 sudo[162887]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 07:00:16 compute-0 systemd[1]: Started Session c4 of User root.
Sep 30 07:00:16 compute-0 sudo[162887]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 07:00:16 compute-0 sudo[162887]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:16 compute-0 systemd[1]: session-c4.scope: Deactivated successfully.
Sep 30 07:00:16 compute-0 iscsid[162762]: + [[ ! -n '' ]]
Sep 30 07:00:16 compute-0 iscsid[162762]: + . kolla_extend_start
Sep 30 07:00:16 compute-0 iscsid[162762]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Sep 30 07:00:16 compute-0 iscsid[162762]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Sep 30 07:00:16 compute-0 iscsid[162762]: Running command: '/usr/sbin/iscsid -f'
Sep 30 07:00:16 compute-0 iscsid[162762]: + umask 0022
Sep 30 07:00:16 compute-0 iscsid[162762]: + exec /usr/sbin/iscsid -f
Sep 30 07:00:16 compute-0 kernel: Loading iSCSI transport class v2.0-870.
Sep 30 07:00:17 compute-0 python3.9[162971]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:00:17 compute-0 sudo[163121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzwiaxwcstbprxodiniiidarignmrsen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215617.562094-826-90998758108634/AnsiballZ_file.py'
Sep 30 07:00:17 compute-0 sudo[163121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:18 compute-0 python3.9[163123]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:18 compute-0 sudo[163121]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:19 compute-0 sudo[163273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njjonwdokrizeflcqwexmbkbskvsafri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215618.6650097-848-206374596942505/AnsiballZ_service_facts.py'
Sep 30 07:00:19 compute-0 sudo[163273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:19 compute-0 sshd-session[162517]: Failed password for root from 193.46.255.244 port 16192 ssh2
Sep 30 07:00:19 compute-0 python3.9[163275]: ansible-ansible.builtin.service_facts Invoked
Sep 30 07:00:19 compute-0 network[163292]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 07:00:19 compute-0 network[163293]: 'network-scripts' will be removed from distribution in near future.
Sep 30 07:00:19 compute-0 network[163294]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 07:00:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:00:20.501 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:00:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:00:20.502 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:00:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:00:20.502 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:00:20 compute-0 unix_chkpwd[163323]: password check failed for user (root)
Sep 30 07:00:21 compute-0 podman[163349]: 2025-09-30 07:00:21.789811367 +0000 UTC m=+0.158499215 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 07:00:22 compute-0 sshd-session[162517]: Failed password for root from 193.46.255.244 port 16192 ssh2
Sep 30 07:00:22 compute-0 sshd-session[162517]: Received disconnect from 193.46.255.244 port 16192:11:  [preauth]
Sep 30 07:00:22 compute-0 sshd-session[162517]: Disconnected from authenticating user root 193.46.255.244 port 16192 [preauth]
Sep 30 07:00:22 compute-0 sshd-session[162517]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Sep 30 07:00:23 compute-0 sudo[163273]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:23 compute-0 unix_chkpwd[163471]: password check failed for user (root)
Sep 30 07:00:23 compute-0 sshd-session[163427]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Sep 30 07:00:25 compute-0 podman[163472]: 2025-09-30 07:00:25.517104361 +0000 UTC m=+0.092971190 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930)
Sep 30 07:00:26 compute-0 sshd-session[163427]: Failed password for root from 193.46.255.244 port 56742 ssh2
Sep 30 07:00:26 compute-0 systemd[1]: Stopping User Manager for UID 0...
Sep 30 07:00:26 compute-0 systemd[162784]: Activating special unit Exit the Session...
Sep 30 07:00:26 compute-0 systemd[162784]: Stopped target Main User Target.
Sep 30 07:00:26 compute-0 systemd[162784]: Stopped target Basic System.
Sep 30 07:00:26 compute-0 systemd[162784]: Stopped target Paths.
Sep 30 07:00:26 compute-0 systemd[162784]: Stopped target Sockets.
Sep 30 07:00:26 compute-0 systemd[162784]: Stopped target Timers.
Sep 30 07:00:26 compute-0 systemd[162784]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 07:00:26 compute-0 systemd[162784]: Closed D-Bus User Message Bus Socket.
Sep 30 07:00:26 compute-0 systemd[162784]: Stopped Create User's Volatile Files and Directories.
Sep 30 07:00:26 compute-0 systemd[162784]: Removed slice User Application Slice.
Sep 30 07:00:26 compute-0 systemd[162784]: Reached target Shutdown.
Sep 30 07:00:26 compute-0 systemd[162784]: Finished Exit the Session.
Sep 30 07:00:26 compute-0 systemd[162784]: Reached target Exit the Session.
Sep 30 07:00:26 compute-0 systemd[1]: user@0.service: Deactivated successfully.
Sep 30 07:00:26 compute-0 systemd[1]: Stopped User Manager for UID 0.
Sep 30 07:00:26 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/0...
Sep 30 07:00:26 compute-0 systemd[1]: run-user-0.mount: Deactivated successfully.
Sep 30 07:00:26 compute-0 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Sep 30 07:00:26 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/0.
Sep 30 07:00:26 compute-0 systemd[1]: Removed slice User Slice of UID 0.
Sep 30 07:00:27 compute-0 unix_chkpwd[163493]: password check failed for user (root)
Sep 30 07:00:28 compute-0 sudo[163618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyoiyldpnrjgyygoljfjpilsuwsgaafn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215628.0366292-868-97098443919493/AnsiballZ_file.py'
Sep 30 07:00:28 compute-0 sudo[163618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:28 compute-0 python3.9[163620]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 07:00:28 compute-0 sudo[163618]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:29 compute-0 sudo[163770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kuzzdcdryhzckdxuczncfxjcmmpypbfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215628.9158149-884-275096841766993/AnsiballZ_modprobe.py'
Sep 30 07:00:29 compute-0 sudo[163770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:29 compute-0 python3.9[163772]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Sep 30 07:00:29 compute-0 sudo[163770]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:30 compute-0 sshd-session[163427]: Failed password for root from 193.46.255.244 port 56742 ssh2
Sep 30 07:00:30 compute-0 sudo[163926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bssoaorsdxotqdahjzvmwjhphlqdlmev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215630.067799-900-155609518827823/AnsiballZ_stat.py'
Sep 30 07:00:30 compute-0 sudo[163926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:30 compute-0 python3.9[163928]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:00:30 compute-0 sudo[163926]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:31 compute-0 sudo[164049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxssftykdqccrzuzfhpzqwduusmijcou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215630.067799-900-155609518827823/AnsiballZ_copy.py'
Sep 30 07:00:31 compute-0 sudo[164049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:31 compute-0 python3.9[164051]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215630.067799-900-155609518827823/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:31 compute-0 sudo[164049]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:32 compute-0 unix_chkpwd[164182]: password check failed for user (root)
Sep 30 07:00:32 compute-0 sudo[164202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxagfavvahiaqpiuqbsueikodpjcgfmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215631.7788131-932-188225262189590/AnsiballZ_lineinfile.py'
Sep 30 07:00:32 compute-0 sudo[164202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:32 compute-0 python3.9[164204]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:32 compute-0 sudo[164202]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:33 compute-0 sudo[164354]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzyugcgdamicqrqwcytzysafouptncrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215632.6954577-948-31458925087866/AnsiballZ_systemd.py'
Sep 30 07:00:33 compute-0 sudo[164354]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:33 compute-0 python3.9[164356]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 07:00:33 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 30 07:00:33 compute-0 systemd[1]: Stopped Load Kernel Modules.
Sep 30 07:00:33 compute-0 systemd[1]: Stopping Load Kernel Modules...
Sep 30 07:00:33 compute-0 systemd[1]: Starting Load Kernel Modules...
Sep 30 07:00:33 compute-0 systemd[1]: Finished Load Kernel Modules.
Sep 30 07:00:33 compute-0 sudo[164354]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:34 compute-0 sshd-session[163427]: Failed password for root from 193.46.255.244 port 56742 ssh2
Sep 30 07:00:34 compute-0 sshd-session[163427]: Received disconnect from 193.46.255.244 port 56742:11:  [preauth]
Sep 30 07:00:34 compute-0 sudo[164510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfeznurytezuaxknjbkyprtrlmgztpqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215633.8399417-964-47824052555479/AnsiballZ_file.py'
Sep 30 07:00:34 compute-0 sshd-session[163427]: Disconnected from authenticating user root 193.46.255.244 port 56742 [preauth]
Sep 30 07:00:34 compute-0 sshd-session[163427]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Sep 30 07:00:34 compute-0 sudo[164510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:34 compute-0 python3.9[164512]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:00:34 compute-0 sudo[164510]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:35 compute-0 unix_chkpwd[164614]: password check failed for user (root)
Sep 30 07:00:35 compute-0 sshd-session[164513]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Sep 30 07:00:35 compute-0 sudo[164665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qasdlkcmipnxhkmksckzmtxzwhurseme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215634.9291916-982-40769418676290/AnsiballZ_stat.py'
Sep 30 07:00:35 compute-0 sudo[164665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:35 compute-0 python3.9[164667]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:00:35 compute-0 sudo[164665]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:35 compute-0 systemd[1]: virtnodedevd.service: Deactivated successfully.
Sep 30 07:00:36 compute-0 sudo[164818]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgfmzrcczwunzbejflpcysxcaipmttit ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215635.8053353-1000-3593819042524/AnsiballZ_stat.py'
Sep 30 07:00:36 compute-0 sudo[164818]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:36 compute-0 python3.9[164820]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:00:36 compute-0 sudo[164818]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:37 compute-0 sudo[164970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbuixuubderfvbdnescyhmqzymrnrqjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215636.686305-1016-244033931709734/AnsiballZ_stat.py'
Sep 30 07:00:37 compute-0 sudo[164970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:37 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 07:00:37 compute-0 python3.9[164972]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:00:37 compute-0 sudo[164970]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:37 compute-0 sshd-session[164513]: Failed password for root from 193.46.255.244 port 23594 ssh2
Sep 30 07:00:37 compute-0 sudo[165094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cfmrrowxqiohmtaxxueqzrtesuxzoeyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215636.686305-1016-244033931709734/AnsiballZ_copy.py'
Sep 30 07:00:37 compute-0 sudo[165094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:38 compute-0 python3.9[165096]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215636.686305-1016-244033931709734/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:38 compute-0 sudo[165094]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:38 compute-0 systemd[1]: virtqemud.service: Deactivated successfully.
Sep 30 07:00:38 compute-0 sudo[165247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dokttqzpzaqrunoezplcvyqxiozrnyub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215638.297927-1046-199415112520333/AnsiballZ_command.py'
Sep 30 07:00:38 compute-0 sudo[165247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:39 compute-0 python3.9[165249]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:00:39 compute-0 sudo[165247]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:39 compute-0 unix_chkpwd[165251]: password check failed for user (root)
Sep 30 07:00:39 compute-0 sudo[165401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwgbjxfufsfotlvrxgizekardqwyizcu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215639.481201-1062-170269804016720/AnsiballZ_lineinfile.py'
Sep 30 07:00:39 compute-0 sudo[165401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:39 compute-0 systemd[1]: virtsecretd.service: Deactivated successfully.
Sep 30 07:00:40 compute-0 python3.9[165403]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:40 compute-0 sudo[165401]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:41 compute-0 sudo[165554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzitucnjipylkttdmtofxvbiohntfybv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215640.442564-1078-170487354723682/AnsiballZ_replace.py'
Sep 30 07:00:41 compute-0 sudo[165554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:41 compute-0 python3.9[165556]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:41 compute-0 sudo[165554]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:41 compute-0 sshd-session[164513]: Failed password for root from 193.46.255.244 port 23594 ssh2
Sep 30 07:00:41 compute-0 sudo[165706]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pfmvpsmdwymhymejbvztjstflrvbxnwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215641.5105252-1094-35335058136332/AnsiballZ_replace.py'
Sep 30 07:00:41 compute-0 sudo[165706]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:42 compute-0 python3.9[165708]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:42 compute-0 sudo[165706]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:42 compute-0 sudo[165858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcufeteikagzddxiynfemnwiiutsdcyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215642.5087378-1112-252781932124919/AnsiballZ_lineinfile.py'
Sep 30 07:00:42 compute-0 sudo[165858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:43 compute-0 python3.9[165860]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:43 compute-0 sudo[165858]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:43 compute-0 unix_chkpwd[165885]: password check failed for user (root)
Sep 30 07:00:43 compute-0 sudo[166011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyabmzenqtbbysifgdfyieujyhpmhwpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215643.5888033-1112-67147341921700/AnsiballZ_lineinfile.py'
Sep 30 07:00:43 compute-0 sudo[166011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:44 compute-0 python3.9[166013]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:44 compute-0 sudo[166011]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:44 compute-0 sudo[166163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgjkdwmluxznlviyichvxgmcjzfjtooz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215644.4161246-1112-132392732567780/AnsiballZ_lineinfile.py'
Sep 30 07:00:44 compute-0 sudo[166163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:44 compute-0 python3.9[166165]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:44 compute-0 sudo[166163]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:45 compute-0 sudo[166315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lnkhioapbvociazzikwuwcqehxdyycbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215645.1497457-1112-164725213418772/AnsiballZ_lineinfile.py'
Sep 30 07:00:45 compute-0 sudo[166315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:45 compute-0 python3.9[166317]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:45 compute-0 sudo[166315]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:45 compute-0 sshd-session[164513]: Failed password for root from 193.46.255.244 port 23594 ssh2
Sep 30 07:00:46 compute-0 sudo[166467]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqbhgbtdjtytoqlkuwccgcihclrfqhfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215645.985583-1170-99039193445746/AnsiballZ_stat.py'
Sep 30 07:00:46 compute-0 sudo[166467]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:46 compute-0 python3.9[166469]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:00:46 compute-0 sudo[166467]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:47 compute-0 sudo[166632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckgkkzwycfhfrgtpgzrxrgwmsqkrfcov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215646.7503092-1186-211053580192636/AnsiballZ_file.py'
Sep 30 07:00:47 compute-0 sudo[166632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:47 compute-0 podman[166595]: 2025-09-30 07:00:47.178149477 +0000 UTC m=+0.103157390 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:00:47 compute-0 python3.9[166636]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:47 compute-0 sudo[166632]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:47 compute-0 sshd-session[164513]: Received disconnect from 193.46.255.244 port 23594:11:  [preauth]
Sep 30 07:00:47 compute-0 sshd-session[164513]: Disconnected from authenticating user root 193.46.255.244 port 23594 [preauth]
Sep 30 07:00:47 compute-0 sshd-session[164513]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.244  user=root
Sep 30 07:00:48 compute-0 sudo[166793]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hsnbhtcdpaluxlqimdpmvyepwbgrduqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215647.6909285-1204-63494106084373/AnsiballZ_file.py'
Sep 30 07:00:48 compute-0 sudo[166793]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:48 compute-0 python3.9[166795]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:00:48 compute-0 sudo[166793]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:48 compute-0 sudo[166945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhoctknkxnejmiizhzbnvjmercmfyagt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215648.5415769-1220-163592434377373/AnsiballZ_stat.py'
Sep 30 07:00:48 compute-0 sudo[166945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:49 compute-0 python3.9[166947]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:00:49 compute-0 sudo[166945]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:49 compute-0 sudo[167023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oalemabavhiemkmfjiuceituiqbjbvcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215648.5415769-1220-163592434377373/AnsiballZ_file.py'
Sep 30 07:00:49 compute-0 sudo[167023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:49 compute-0 python3.9[167025]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:00:49 compute-0 sudo[167023]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:50 compute-0 sudo[167175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyljghcrnqgssnyztwwbuceqctugzout ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215649.8632133-1220-142583755586612/AnsiballZ_stat.py'
Sep 30 07:00:50 compute-0 sudo[167175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:50 compute-0 python3.9[167177]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:00:50 compute-0 sudo[167175]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:50 compute-0 sudo[167253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmbylmifqpxjkczbwnypvockulrcemsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215649.8632133-1220-142583755586612/AnsiballZ_file.py'
Sep 30 07:00:50 compute-0 sudo[167253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:51 compute-0 python3.9[167255]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:00:51 compute-0 sudo[167253]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:51 compute-0 sudo[167405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkajigpdwqmkvnrramlfsavctomudala ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215651.3319445-1266-183868205043390/AnsiballZ_file.py'
Sep 30 07:00:51 compute-0 sudo[167405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:52 compute-0 python3.9[167407]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:52 compute-0 sudo[167405]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:52 compute-0 podman[167484]: 2025-09-30 07:00:52.563103891 +0000 UTC m=+0.139733889 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 07:00:52 compute-0 sudo[167581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-piawlptzivtvdwoaskxrkhutuwdfkwyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215652.2901282-1282-149131797135642/AnsiballZ_stat.py'
Sep 30 07:00:52 compute-0 sudo[167581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:52 compute-0 python3.9[167583]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:00:52 compute-0 sudo[167581]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:53 compute-0 sudo[167659]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psjthcemmbcfqxwytqrhhgkkkuusnpiu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215652.2901282-1282-149131797135642/AnsiballZ_file.py'
Sep 30 07:00:53 compute-0 sudo[167659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:53 compute-0 python3.9[167661]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:53 compute-0 sudo[167659]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:54 compute-0 sudo[167811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jktvztjmppfjxsstpibvaybwgfpszmxi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215653.7677114-1306-177683311262586/AnsiballZ_stat.py'
Sep 30 07:00:54 compute-0 sudo[167811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:54 compute-0 python3.9[167813]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:00:54 compute-0 sudo[167811]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:54 compute-0 sudo[167889]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxtrbhefkcelbszsknqnvxqjuelmaftr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215653.7677114-1306-177683311262586/AnsiballZ_file.py'
Sep 30 07:00:54 compute-0 sudo[167889]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:54 compute-0 python3.9[167891]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:54 compute-0 sudo[167889]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:55 compute-0 sudo[168041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yofcvanxzmawtadomllgvjhdnujchtys ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215655.1659796-1330-58842100352719/AnsiballZ_systemd.py'
Sep 30 07:00:55 compute-0 sudo[168041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:55 compute-0 podman[168043]: 2025-09-30 07:00:55.712560861 +0000 UTC m=+0.088978293 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Sep 30 07:00:55 compute-0 python3.9[168044]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:00:55 compute-0 systemd[1]: Reloading.
Sep 30 07:00:56 compute-0 systemd-rc-local-generator[168084]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:00:56 compute-0 systemd-sysv-generator[168090]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:00:56 compute-0 sudo[168041]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:56 compute-0 sudo[168249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-snyofffjruqncrypdemawgdzoklyufkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215656.558959-1346-187670516862144/AnsiballZ_stat.py'
Sep 30 07:00:56 compute-0 sudo[168249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:57 compute-0 python3.9[168251]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:00:57 compute-0 sudo[168249]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:57 compute-0 sudo[168327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjcgzbuxrpimiaklwucvuxcrmxubfwjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215656.558959-1346-187670516862144/AnsiballZ_file.py'
Sep 30 07:00:57 compute-0 sudo[168327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:57 compute-0 python3.9[168329]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:57 compute-0 sudo[168327]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:58 compute-0 sudo[168479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ledzyeloojkgysboezhidutokayogzdm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215657.9044116-1370-45276747167280/AnsiballZ_stat.py'
Sep 30 07:00:58 compute-0 sudo[168479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:58 compute-0 python3.9[168481]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:00:58 compute-0 sudo[168479]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:58 compute-0 sudo[168557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ybqmlcrlqckmuehpqmujjishrcumnfmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215657.9044116-1370-45276747167280/AnsiballZ_file.py'
Sep 30 07:00:58 compute-0 sudo[168557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:00:59 compute-0 python3.9[168559]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:00:59 compute-0 sudo[168557]: pam_unix(sudo:session): session closed for user root
Sep 30 07:00:59 compute-0 sudo[168709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gisdafvzrwwubwmeajwqrvdsbfasunjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215659.371964-1394-116590928710458/AnsiballZ_systemd.py'
Sep 30 07:00:59 compute-0 sudo[168709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:00 compute-0 python3.9[168711]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:01:00 compute-0 systemd[1]: Reloading.
Sep 30 07:01:00 compute-0 systemd-sysv-generator[168743]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:01:00 compute-0 systemd-rc-local-generator[168740]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:01:00 compute-0 systemd[1]: Starting Create netns directory...
Sep 30 07:01:00 compute-0 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Sep 30 07:01:00 compute-0 systemd[1]: netns-placeholder.service: Deactivated successfully.
Sep 30 07:01:00 compute-0 systemd[1]: Finished Create netns directory.
Sep 30 07:01:00 compute-0 sudo[168709]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:01 compute-0 CROND[168825]: (root) CMD (run-parts /etc/cron.hourly)
Sep 30 07:01:01 compute-0 run-parts[168831]: (/etc/cron.hourly) starting 0anacron
Sep 30 07:01:01 compute-0 anacron[168842]: Anacron started on 2025-09-30
Sep 30 07:01:01 compute-0 anacron[168842]: Will run job `cron.daily' in 19 min.
Sep 30 07:01:01 compute-0 anacron[168842]: Will run job `cron.weekly' in 39 min.
Sep 30 07:01:01 compute-0 anacron[168842]: Will run job `cron.monthly' in 59 min.
Sep 30 07:01:01 compute-0 anacron[168842]: Jobs will be executed sequentially
Sep 30 07:01:01 compute-0 run-parts[168844]: (/etc/cron.hourly) finished 0anacron
Sep 30 07:01:01 compute-0 CROND[168821]: (root) CMDEND (run-parts /etc/cron.hourly)
Sep 30 07:01:01 compute-0 sudo[168918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isbrbujobeymulvrgajrsxtspqfzloym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215660.9837794-1414-108282495178791/AnsiballZ_file.py'
Sep 30 07:01:01 compute-0 sudo[168918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:01 compute-0 python3.9[168920]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:01:01 compute-0 sudo[168918]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:02 compute-0 sudo[169070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzntntahetysvlvablperyvybaiynmxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215661.8509364-1430-242672957677736/AnsiballZ_stat.py'
Sep 30 07:01:02 compute-0 sudo[169070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:02 compute-0 python3.9[169072]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:01:02 compute-0 sudo[169070]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:02 compute-0 sudo[169193]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iqkelosqrtxfdspbaexsaurrpsvgwjwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215661.8509364-1430-242672957677736/AnsiballZ_copy.py'
Sep 30 07:01:03 compute-0 sudo[169193]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:03 compute-0 python3.9[169195]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215661.8509364-1430-242672957677736/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:01:03 compute-0 sudo[169193]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:04 compute-0 sudo[169345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiitqpunpnmehcfbxddifyvpzmnhjbty ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215663.7706335-1464-23241417014008/AnsiballZ_file.py'
Sep 30 07:01:04 compute-0 sudo[169345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:04 compute-0 python3.9[169347]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:01:04 compute-0 sudo[169345]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:05 compute-0 sudo[169497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxykarnxvornknfemamvoxbergplborx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215664.6844413-1480-99378772158326/AnsiballZ_stat.py'
Sep 30 07:01:05 compute-0 sudo[169497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:05 compute-0 python3.9[169499]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:01:05 compute-0 sudo[169497]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:05 compute-0 sudo[169620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-brieqvhxhuvzjzkmbrdirqglgpkniqgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215664.6844413-1480-99378772158326/AnsiballZ_copy.py'
Sep 30 07:01:05 compute-0 sudo[169620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:06 compute-0 python3.9[169622]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215664.6844413-1480-99378772158326/.source.json _original_basename=.pf8ri0a6 follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:01:06 compute-0 sudo[169620]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:06 compute-0 sudo[169772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzofkauarzltogjyfrrhihkcuxupqcoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215666.3296223-1510-132990768122785/AnsiballZ_file.py'
Sep 30 07:01:06 compute-0 sudo[169772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:06 compute-0 python3.9[169774]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:01:06 compute-0 sudo[169772]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:07 compute-0 sudo[169924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-skdkaldptakjvdcbvmzeabjtzckrkyxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215667.2438018-1526-93456848389576/AnsiballZ_stat.py'
Sep 30 07:01:07 compute-0 sudo[169924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:07 compute-0 sudo[169924]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:08 compute-0 sudo[170047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikzozuclgvkopsbhqelikzqtziqrwydu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215667.2438018-1526-93456848389576/AnsiballZ_copy.py'
Sep 30 07:01:08 compute-0 sudo[170047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:08 compute-0 sudo[170047]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:09 compute-0 sudo[170199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxmvsttnsebumcrnulktskwncfxtwvzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215669.025967-1560-173728963144162/AnsiballZ_container_config_data.py'
Sep 30 07:01:09 compute-0 sudo[170199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:09 compute-0 python3.9[170201]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Sep 30 07:01:09 compute-0 sudo[170199]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:10 compute-0 sudo[170351]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zmpbjxfggazpbcrfnychsyylygqfysvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215670.014937-1578-153433637928453/AnsiballZ_container_config_hash.py'
Sep 30 07:01:10 compute-0 sudo[170351]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:10 compute-0 python3.9[170353]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 07:01:10 compute-0 sudo[170351]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:11 compute-0 sudo[170503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvnsecgjcbzfwdqvlvijyayaikozqcbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215671.0244982-1596-185005248214444/AnsiballZ_podman_container_info.py'
Sep 30 07:01:11 compute-0 sudo[170503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:11 compute-0 python3.9[170505]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Sep 30 07:01:11 compute-0 sudo[170503]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:13 compute-0 sudo[170681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyzggmrrtthxaekvuxpyjkspnjgxrvmo ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759215672.8247066-1622-196808567942885/AnsiballZ_edpm_container_manage.py'
Sep 30 07:01:13 compute-0 sudo[170681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:13 compute-0 python3[170683]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 07:01:13 compute-0 podman[170722]: 2025-09-30 07:01:13.814097205 +0000 UTC m=+0.069069746 container create 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:01:13 compute-0 podman[170722]: 2025-09-30 07:01:13.773279912 +0000 UTC m=+0.028252443 image pull 0fb6856fe8f53101c9a68be625474646cbb6c5306dfa9570ef7defb7c487fcd5 38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Sep 30 07:01:13 compute-0 python3[170683]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z 38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest
Sep 30 07:01:14 compute-0 sudo[170681]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:14 compute-0 sudo[170911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkvwrxovuengzmnjahtvviidlfygxnhx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215674.2722294-1638-87715426260184/AnsiballZ_stat.py'
Sep 30 07:01:14 compute-0 sudo[170911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:14 compute-0 python3.9[170913]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:01:14 compute-0 sudo[170911]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:15 compute-0 sudo[171065]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-moskshrfupjmzqifkeebmghzyixdcftv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215675.29827-1656-38583014530761/AnsiballZ_file.py'
Sep 30 07:01:15 compute-0 sudo[171065]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:15 compute-0 python3.9[171067]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:01:15 compute-0 sudo[171065]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:16 compute-0 sudo[171141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxvyfzjfcduytqpdcawzrepeemcgxucf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215675.29827-1656-38583014530761/AnsiballZ_stat.py'
Sep 30 07:01:16 compute-0 sudo[171141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:16 compute-0 python3.9[171143]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:01:16 compute-0 sudo[171141]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:17 compute-0 sudo[171292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqifjmfeuqfvyuqyylpbazokyqxgtupt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215676.544907-1656-54772697060688/AnsiballZ_copy.py'
Sep 30 07:01:17 compute-0 sudo[171292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:17 compute-0 python3.9[171294]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759215676.544907-1656-54772697060688/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:01:17 compute-0 sudo[171292]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:17 compute-0 podman[171295]: 2025-09-30 07:01:17.538773505 +0000 UTC m=+0.105858004 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, tcib_managed=true, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 07:01:17 compute-0 sudo[171387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkasgkngcphtgublrjxfqfewrbmaysht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215676.544907-1656-54772697060688/AnsiballZ_systemd.py'
Sep 30 07:01:17 compute-0 sudo[171387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:18 compute-0 python3.9[171389]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 07:01:18 compute-0 systemd[1]: Reloading.
Sep 30 07:01:18 compute-0 systemd-rc-local-generator[171413]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:01:18 compute-0 systemd-sysv-generator[171417]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:01:18 compute-0 sudo[171387]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:18 compute-0 sudo[171498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nodotojbaonhnewuaxuwvsjaovrfbzxy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215676.544907-1656-54772697060688/AnsiballZ_systemd.py'
Sep 30 07:01:18 compute-0 sudo[171498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:19 compute-0 python3.9[171500]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:01:19 compute-0 systemd[1]: Reloading.
Sep 30 07:01:19 compute-0 systemd-sysv-generator[171534]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:01:19 compute-0 systemd-rc-local-generator[171530]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:01:19 compute-0 systemd[1]: Starting multipathd container...
Sep 30 07:01:19 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:01:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00df26fb8f3b8c0a26794a2b0337fa2987c03e5e2b464793f43d53a273e8e5ae/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 07:01:19 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00df26fb8f3b8c0a26794a2b0337fa2987c03e5e2b464793f43d53a273e8e5ae/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 07:01:19 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9.
Sep 30 07:01:19 compute-0 podman[171541]: 2025-09-30 07:01:19.85135518 +0000 UTC m=+0.180291333 container init 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, managed_by=edpm_ansible, container_name=multipathd, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 07:01:19 compute-0 multipathd[171557]: + sudo -E kolla_set_configs
Sep 30 07:01:19 compute-0 podman[171541]: 2025-09-30 07:01:19.89449849 +0000 UTC m=+0.223434663 container start 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 07:01:19 compute-0 sudo[171563]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 07:01:19 compute-0 podman[171541]: multipathd
Sep 30 07:01:19 compute-0 sudo[171563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 07:01:19 compute-0 systemd[1]: Started multipathd container.
Sep 30 07:01:19 compute-0 sudo[171498]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:19 compute-0 multipathd[171557]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 07:01:19 compute-0 multipathd[171557]: INFO:__main__:Validating config file
Sep 30 07:01:19 compute-0 multipathd[171557]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 07:01:19 compute-0 multipathd[171557]: INFO:__main__:Writing out command to execute
Sep 30 07:01:19 compute-0 sudo[171563]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:19 compute-0 multipathd[171557]: ++ cat /run_command
Sep 30 07:01:19 compute-0 multipathd[171557]: + CMD='/usr/sbin/multipathd -d'
Sep 30 07:01:19 compute-0 multipathd[171557]: + ARGS=
Sep 30 07:01:19 compute-0 multipathd[171557]: + sudo kolla_copy_cacerts
Sep 30 07:01:20 compute-0 sudo[171588]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 07:01:20 compute-0 sudo[171588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 07:01:20 compute-0 sudo[171588]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:20 compute-0 multipathd[171557]: + [[ ! -n '' ]]
Sep 30 07:01:20 compute-0 multipathd[171557]: + . kolla_extend_start
Sep 30 07:01:20 compute-0 multipathd[171557]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Sep 30 07:01:20 compute-0 multipathd[171557]: Running command: '/usr/sbin/multipathd -d'
Sep 30 07:01:20 compute-0 multipathd[171557]: + umask 0022
Sep 30 07:01:20 compute-0 multipathd[171557]: + exec /usr/sbin/multipathd -d
Sep 30 07:01:20 compute-0 podman[171564]: 2025-09-30 07:01:20.024038473 +0000 UTC m=+0.111261659 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 07:01:20 compute-0 systemd[1]: 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9-762d535ddcb48b3e.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 07:01:20 compute-0 systemd[1]: 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9-762d535ddcb48b3e.service: Failed with result 'exit-code'.
Sep 30 07:01:20 compute-0 multipathd[171557]: 3537.831306 | multipathd v0.9.9: start up
Sep 30 07:01:20 compute-0 multipathd[171557]: 3537.842973 | reconfigure: setting up paths and maps
Sep 30 07:01:20 compute-0 multipathd[171557]: 3537.845202 | _check_bindings_file: failed to read header from /etc/multipath/bindings
Sep 30 07:01:20 compute-0 multipathd[171557]: 3537.846884 | updated bindings file /etc/multipath/bindings
Sep 30 07:01:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:01:20.509 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:01:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:01:20.510 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:01:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:01:20.510 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:01:20 compute-0 python3.9[171747]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:01:21 compute-0 sudo[171899]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtxkhcinhetsrgagxratxeancdrurdcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215681.0412242-1728-200448925159651/AnsiballZ_command.py'
Sep 30 07:01:21 compute-0 sudo[171899]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:21 compute-0 python3.9[171901]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:01:21 compute-0 sudo[171899]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:22 compute-0 sudo[172063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qldipcxexisyvvyiwtuioifladzcguse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215682.0511522-1744-114344825144412/AnsiballZ_systemd.py'
Sep 30 07:01:22 compute-0 sudo[172063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:22 compute-0 python3.9[172065]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 07:01:22 compute-0 systemd[1]: Stopping multipathd container...
Sep 30 07:01:22 compute-0 multipathd[171557]: 3540.681544 | multipathd: shut down
Sep 30 07:01:22 compute-0 systemd[1]: libpod-4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9.scope: Deactivated successfully.
Sep 30 07:01:22 compute-0 podman[172075]: 2025-09-30 07:01:22.942553292 +0000 UTC m=+0.089713119 container died 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 07:01:22 compute-0 systemd[1]: 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9-762d535ddcb48b3e.timer: Deactivated successfully.
Sep 30 07:01:22 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9.
Sep 30 07:01:22 compute-0 podman[172067]: 2025-09-30 07:01:22.967195671 +0000 UTC m=+0.147001556 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 07:01:22 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9-userdata-shm.mount: Deactivated successfully.
Sep 30 07:01:23 compute-0 systemd[1]: var-lib-containers-storage-overlay-00df26fb8f3b8c0a26794a2b0337fa2987c03e5e2b464793f43d53a273e8e5ae-merged.mount: Deactivated successfully.
Sep 30 07:01:23 compute-0 podman[172075]: 2025-09-30 07:01:23.00683765 +0000 UTC m=+0.153997457 container cleanup 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=watcher_latest)
Sep 30 07:01:23 compute-0 podman[172075]: multipathd
Sep 30 07:01:23 compute-0 podman[172119]: multipathd
Sep 30 07:01:23 compute-0 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Sep 30 07:01:23 compute-0 systemd[1]: Stopped multipathd container.
Sep 30 07:01:23 compute-0 systemd[1]: Starting multipathd container...
Sep 30 07:01:23 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:01:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00df26fb8f3b8c0a26794a2b0337fa2987c03e5e2b464793f43d53a273e8e5ae/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 07:01:23 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/00df26fb8f3b8c0a26794a2b0337fa2987c03e5e2b464793f43d53a273e8e5ae/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 07:01:23 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9.
Sep 30 07:01:23 compute-0 podman[172132]: 2025-09-30 07:01:23.331776219 +0000 UTC m=+0.179689865 container init 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Sep 30 07:01:23 compute-0 multipathd[172147]: + sudo -E kolla_set_configs
Sep 30 07:01:23 compute-0 sudo[172153]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Sep 30 07:01:23 compute-0 podman[172132]: 2025-09-30 07:01:23.373630312 +0000 UTC m=+0.221543908 container start 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true)
Sep 30 07:01:23 compute-0 sudo[172153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 07:01:23 compute-0 podman[172132]: multipathd
Sep 30 07:01:23 compute-0 systemd[1]: Started multipathd container.
Sep 30 07:01:23 compute-0 sudo[172063]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:23 compute-0 multipathd[172147]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 07:01:23 compute-0 multipathd[172147]: INFO:__main__:Validating config file
Sep 30 07:01:23 compute-0 multipathd[172147]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 07:01:23 compute-0 multipathd[172147]: INFO:__main__:Writing out command to execute
Sep 30 07:01:23 compute-0 sudo[172153]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:23 compute-0 multipathd[172147]: ++ cat /run_command
Sep 30 07:01:23 compute-0 multipathd[172147]: + CMD='/usr/sbin/multipathd -d'
Sep 30 07:01:23 compute-0 multipathd[172147]: + ARGS=
Sep 30 07:01:23 compute-0 multipathd[172147]: + sudo kolla_copy_cacerts
Sep 30 07:01:23 compute-0 sudo[172179]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Sep 30 07:01:23 compute-0 sudo[172179]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Sep 30 07:01:23 compute-0 sudo[172179]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:23 compute-0 multipathd[172147]: + [[ ! -n '' ]]
Sep 30 07:01:23 compute-0 multipathd[172147]: + . kolla_extend_start
Sep 30 07:01:23 compute-0 multipathd[172147]: Running command: '/usr/sbin/multipathd -d'
Sep 30 07:01:23 compute-0 multipathd[172147]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Sep 30 07:01:23 compute-0 multipathd[172147]: + umask 0022
Sep 30 07:01:23 compute-0 multipathd[172147]: + exec /usr/sbin/multipathd -d
Sep 30 07:01:23 compute-0 podman[172154]: 2025-09-30 07:01:23.504632667 +0000 UTC m=+0.111852746 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 07:01:23 compute-0 systemd[1]: 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9-31a7357fea4bf21c.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 07:01:23 compute-0 systemd[1]: 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9-31a7357fea4bf21c.service: Failed with result 'exit-code'.
Sep 30 07:01:23 compute-0 multipathd[172147]: 3541.298015 | multipathd v0.9.9: start up
Sep 30 07:01:23 compute-0 multipathd[172147]: 3541.310517 | reconfigure: setting up paths and maps
Sep 30 07:01:24 compute-0 sudo[172337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqfgkfnywtmoigtbxlscfqkpsmqyddyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215683.672841-1760-269997192432454/AnsiballZ_file.py'
Sep 30 07:01:24 compute-0 sudo[172337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:24 compute-0 python3.9[172339]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:01:24 compute-0 sudo[172337]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:25 compute-0 sudo[172489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgnsuborspeonwpkvvrcsmvuycbtmwxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215684.9853992-1784-242708490360341/AnsiballZ_file.py'
Sep 30 07:01:25 compute-0 sudo[172489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:25 compute-0 python3.9[172491]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Sep 30 07:01:25 compute-0 sudo[172489]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:26 compute-0 sudo[172651]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgezzuxtxgahipcfggjrrqohxfkuratr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215685.87002-1800-94951253926739/AnsiballZ_modprobe.py'
Sep 30 07:01:26 compute-0 sudo[172651]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:26 compute-0 podman[172615]: 2025-09-30 07:01:26.309859071 +0000 UTC m=+0.093550050 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Sep 30 07:01:26 compute-0 python3.9[172657]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Sep 30 07:01:26 compute-0 kernel: Key type psk registered
Sep 30 07:01:26 compute-0 sudo[172651]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:27 compute-0 sudo[172819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-joebazfrqwqdhdwdypzgnwzjpyohbhcq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215686.8505893-1816-151298962737430/AnsiballZ_stat.py'
Sep 30 07:01:27 compute-0 sudo[172819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:27 compute-0 python3.9[172821]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:01:27 compute-0 sudo[172819]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:28 compute-0 sudo[172942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqgjzoyldtqtswzxoicvwlbyecqozvhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215686.8505893-1816-151298962737430/AnsiballZ_copy.py'
Sep 30 07:01:28 compute-0 sudo[172942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:28 compute-0 python3.9[172944]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215686.8505893-1816-151298962737430/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:01:28 compute-0 sudo[172942]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:29 compute-0 sudo[173094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ojfnalixuofqaiuduoxpsrqbkkkzuiaw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215688.778823-1848-6603666169112/AnsiballZ_lineinfile.py'
Sep 30 07:01:29 compute-0 sudo[173094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:29 compute-0 python3.9[173096]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:01:29 compute-0 sudo[173094]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:30 compute-0 sudo[173246]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhpkbllntjpmtpwddtdydmprazhutujp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215689.6977139-1864-119763261377940/AnsiballZ_systemd.py'
Sep 30 07:01:30 compute-0 sudo[173246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:30 compute-0 python3.9[173248]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 07:01:30 compute-0 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 30 07:01:30 compute-0 systemd[1]: Stopped Load Kernel Modules.
Sep 30 07:01:30 compute-0 systemd[1]: Stopping Load Kernel Modules...
Sep 30 07:01:30 compute-0 systemd[1]: Starting Load Kernel Modules...
Sep 30 07:01:30 compute-0 systemd[1]: Finished Load Kernel Modules.
Sep 30 07:01:30 compute-0 sudo[173246]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:31 compute-0 sudo[173402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjaixzxskbudcmjhwcwcxiyfjstkfpbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215690.9128215-1880-253316048522604/AnsiballZ_setup.py'
Sep 30 07:01:31 compute-0 sudo[173402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:31 compute-0 python3.9[173404]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Sep 30 07:01:31 compute-0 sudo[173402]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:32 compute-0 sudo[173486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-btykunoqstrddgiztpbabdtjzbmmntss ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215690.9128215-1880-253316048522604/AnsiballZ_dnf.py'
Sep 30 07:01:32 compute-0 sudo[173486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:32 compute-0 python3.9[173488]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 30 07:01:38 compute-0 systemd[1]: Reloading.
Sep 30 07:01:39 compute-0 systemd-rc-local-generator[173519]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:01:39 compute-0 systemd-sysv-generator[173523]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:01:39 compute-0 systemd[1]: Reloading.
Sep 30 07:01:39 compute-0 systemd-rc-local-generator[173555]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:01:39 compute-0 systemd-sysv-generator[173559]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:01:39 compute-0 systemd-logind[824]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 30 07:01:39 compute-0 systemd-logind[824]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Sep 30 07:01:40 compute-0 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Sep 30 07:01:40 compute-0 systemd[1]: Starting man-db-cache-update.service...
Sep 30 07:01:40 compute-0 systemd[1]: Reloading.
Sep 30 07:01:40 compute-0 systemd-rc-local-generator[173649]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:01:40 compute-0 systemd-sysv-generator[173653]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:01:40 compute-0 systemd[1]: Queuing reload/restart jobs for marked units…
Sep 30 07:01:41 compute-0 sudo[173486]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:41 compute-0 sudo[174870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnqtxpewnnozkayndyetewwzuklufnfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215701.370664-1904-243195973280239/AnsiballZ_file.py'
Sep 30 07:01:41 compute-0 sudo[174870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:41 compute-0 python3.9[174895]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:01:42 compute-0 sudo[174870]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:42 compute-0 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Sep 30 07:01:42 compute-0 systemd[1]: Finished man-db-cache-update.service.
Sep 30 07:01:42 compute-0 systemd[1]: man-db-cache-update.service: Consumed 2.272s CPU time.
Sep 30 07:01:42 compute-0 systemd[1]: run-r78eff7c65c4845baba84b55829d4b7eb.service: Deactivated successfully.
Sep 30 07:01:42 compute-0 python3.9[175090]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 07:01:43 compute-0 sudo[175244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifpuafjvjoaepcimwnftzzhoknxduasv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215703.3345132-1939-176001320384869/AnsiballZ_file.py'
Sep 30 07:01:43 compute-0 sudo[175244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:43 compute-0 python3.9[175246]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:01:43 compute-0 sudo[175244]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:45 compute-0 sudo[175396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbglhktndpeydsuuodiieewmmfeyekuu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215704.4212558-1961-157844135258102/AnsiballZ_systemd_service.py'
Sep 30 07:01:45 compute-0 sudo[175396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:45 compute-0 python3.9[175398]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 07:01:45 compute-0 systemd[1]: Reloading.
Sep 30 07:01:45 compute-0 systemd-rc-local-generator[175420]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:01:45 compute-0 systemd-sysv-generator[175426]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:01:45 compute-0 sudo[175396]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:46 compute-0 python3.9[175582]: ansible-ansible.builtin.service_facts Invoked
Sep 30 07:01:46 compute-0 network[175599]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 07:01:46 compute-0 network[175600]: 'network-scripts' will be removed from distribution in near future.
Sep 30 07:01:46 compute-0 network[175601]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 07:01:47 compute-0 podman[175608]: 2025-09-30 07:01:47.818668168 +0000 UTC m=+0.106086560 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, container_name=iscsid, tcib_managed=true, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:01:53 compute-0 sudo[175914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbjpgxigubaigeglgovtljvkmjzfvann ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215712.7269552-1999-190079400928895/AnsiballZ_systemd_service.py'
Sep 30 07:01:53 compute-0 sudo[175914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:53 compute-0 podman[175870]: 2025-09-30 07:01:53.140345337 +0000 UTC m=+0.128049891 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 07:01:53 compute-0 python3.9[175920]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:01:53 compute-0 sudo[175914]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:54 compute-0 sudo[176083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkfedbzekeqtkpasnrxavdxxnjnwsheq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215713.5895982-1999-1970716629714/AnsiballZ_systemd_service.py'
Sep 30 07:01:54 compute-0 sudo[176083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:54 compute-0 podman[176048]: 2025-09-30 07:01:54.028172564 +0000 UTC m=+0.093619242 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:01:54 compute-0 python3.9[176090]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:01:55 compute-0 sudo[176083]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:55 compute-0 sudo[176247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ucldhdxxxesfizsjuocshoqlezcyurya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215715.5828419-1999-201178735742243/AnsiballZ_systemd_service.py'
Sep 30 07:01:55 compute-0 sudo[176247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:56 compute-0 python3.9[176249]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:01:56 compute-0 sudo[176247]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:56 compute-0 podman[176251]: 2025-09-30 07:01:56.402324168 +0000 UTC m=+0.052887431 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 07:01:56 compute-0 sudo[176419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfnduyhogrkssgqqnllzvczixeadewxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215716.4888144-1999-56995562048209/AnsiballZ_systemd_service.py'
Sep 30 07:01:56 compute-0 sudo[176419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:57 compute-0 python3.9[176421]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:01:57 compute-0 sudo[176419]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:57 compute-0 sudo[176572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnhhwkpadfphsiphzacjbzejpaukjwzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215717.3100426-1999-36291543863786/AnsiballZ_systemd_service.py'
Sep 30 07:01:57 compute-0 sudo[176572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:58 compute-0 python3.9[176574]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:01:58 compute-0 sudo[176572]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:58 compute-0 sudo[176725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lxcfcqtzabvhmoywxyhkczuvottabian ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215718.3005238-1999-61444526723345/AnsiballZ_systemd_service.py'
Sep 30 07:01:58 compute-0 sudo[176725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:01:59 compute-0 python3.9[176727]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:01:59 compute-0 sudo[176725]: pam_unix(sudo:session): session closed for user root
Sep 30 07:01:59 compute-0 sudo[176878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zahksqdcjhmjhivzxuutcjwyrjyunkzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215719.2922459-1999-37920427206887/AnsiballZ_systemd_service.py'
Sep 30 07:01:59 compute-0 sudo[176878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:00 compute-0 python3.9[176880]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:02:00 compute-0 sudo[176878]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:00 compute-0 sudo[177031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkzbqhrnokfswulantgrcyomzhznjjea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215720.2973278-1999-214490424763145/AnsiballZ_systemd_service.py'
Sep 30 07:02:00 compute-0 sudo[177031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:01 compute-0 python3.9[177033]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:02:01 compute-0 sudo[177031]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:01 compute-0 sudo[177184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iskzbdgopsvjacplcicusekrnpjkzbyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215721.522158-2117-270179314797851/AnsiballZ_file.py'
Sep 30 07:02:01 compute-0 sudo[177184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:02 compute-0 python3.9[177186]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:02 compute-0 sudo[177184]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:02 compute-0 sudo[177336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmbgcoisfljowhvluiomnlaqehuqhrsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215722.250249-2117-224764411775263/AnsiballZ_file.py'
Sep 30 07:02:02 compute-0 sudo[177336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:02 compute-0 python3.9[177338]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:02 compute-0 sudo[177336]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:03 compute-0 sudo[177488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pplqbzchiztbhppetyiqddjkoqruhmxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215723.0687578-2117-262283046836800/AnsiballZ_file.py'
Sep 30 07:02:03 compute-0 sudo[177488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:03 compute-0 python3.9[177490]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:03 compute-0 sudo[177488]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:04 compute-0 sudo[177640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hulaiguabqlpbxhvugxpepelzbnvssfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215723.8928702-2117-113242636752377/AnsiballZ_file.py'
Sep 30 07:02:04 compute-0 sudo[177640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:04 compute-0 python3.9[177642]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:04 compute-0 sudo[177640]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:05 compute-0 sudo[177792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kurratjzlqsefelhatctulmmskmqotkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215724.7185147-2117-174355772902182/AnsiballZ_file.py'
Sep 30 07:02:05 compute-0 sudo[177792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:05 compute-0 python3.9[177794]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:05 compute-0 sudo[177792]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:05 compute-0 sudo[177944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-msywzajqtfultrsxlgbkswcnklkrnlub ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215725.5701764-2117-102964558078909/AnsiballZ_file.py'
Sep 30 07:02:05 compute-0 sudo[177944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:06 compute-0 python3.9[177946]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:06 compute-0 sudo[177944]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:06 compute-0 sudo[178096]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awtdnaldlkkgcvlvgwfxwmmddxyuhrws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215726.3620958-2117-88527489809125/AnsiballZ_file.py'
Sep 30 07:02:06 compute-0 sudo[178096]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:06 compute-0 python3.9[178098]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:06 compute-0 sudo[178096]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:07 compute-0 sudo[178248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmdzqlfainjnpxatkyxqduhenarhibwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215727.1255345-2117-45211878848310/AnsiballZ_file.py'
Sep 30 07:02:07 compute-0 sudo[178248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:07 compute-0 python3.9[178250]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:07 compute-0 sudo[178248]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:08 compute-0 sudo[178400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jydfyexuvgibgyujluamkqkwkrcecimb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215727.941805-2231-32980184716968/AnsiballZ_file.py'
Sep 30 07:02:08 compute-0 sudo[178400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:08 compute-0 python3.9[178402]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:08 compute-0 sudo[178400]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:09 compute-0 sudo[178552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isoafijtitdfkscazcukobnkmpfifqqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215728.727178-2231-83193828654222/AnsiballZ_file.py'
Sep 30 07:02:09 compute-0 sudo[178552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:09 compute-0 python3.9[178554]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:09 compute-0 sudo[178552]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:09 compute-0 sudo[178704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbqxjstexmtkrntqhjlbemswuwgergjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215729.4474502-2231-31616549262466/AnsiballZ_file.py'
Sep 30 07:02:09 compute-0 sudo[178704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:10 compute-0 python3.9[178706]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:10 compute-0 sudo[178704]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:10 compute-0 sudo[178856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzapnqygxssabybiayfonejcfyymtall ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215730.3316581-2231-20573112102211/AnsiballZ_file.py'
Sep 30 07:02:10 compute-0 sudo[178856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:10 compute-0 python3.9[178858]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:10 compute-0 sudo[178856]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:11 compute-0 sudo[179008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eugnmtfkaditvmflflmnnjnbgznyrwhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215731.132807-2231-153792222268342/AnsiballZ_file.py'
Sep 30 07:02:11 compute-0 sudo[179008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:11 compute-0 python3.9[179010]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:11 compute-0 sudo[179008]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:12 compute-0 sudo[179160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odpchgjfmagomtxltuliuamtqdjimvnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215731.936315-2231-208648185716172/AnsiballZ_file.py'
Sep 30 07:02:12 compute-0 sudo[179160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:12 compute-0 python3.9[179162]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:12 compute-0 sudo[179160]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:13 compute-0 sudo[179312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-occsedpfudrphtbqpjstvsehyffeqfch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215732.6618578-2231-27794286776242/AnsiballZ_file.py'
Sep 30 07:02:13 compute-0 sudo[179312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:13 compute-0 python3.9[179314]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:13 compute-0 sudo[179312]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:13 compute-0 sudo[179464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iefeyabpvwidzxiydburxcencjmrbrfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215733.453921-2231-114908623816266/AnsiballZ_file.py'
Sep 30 07:02:13 compute-0 sudo[179464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:14 compute-0 python3.9[179466]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:14 compute-0 sudo[179464]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:14 compute-0 sudo[179616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-canfhyuwmplilrkblxpbmryrrohrqcjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215734.437704-2347-227639543177331/AnsiballZ_command.py'
Sep 30 07:02:14 compute-0 sudo[179616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:15 compute-0 python3.9[179618]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:02:15 compute-0 sudo[179616]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:16 compute-0 python3.9[179770]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 07:02:16 compute-0 sudo[179920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejxcikjebzqvrhempwbjtzyaqededjqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215736.4026568-2383-94236694508694/AnsiballZ_systemd_service.py'
Sep 30 07:02:16 compute-0 sudo[179920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:17 compute-0 python3.9[179922]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 07:02:17 compute-0 systemd[1]: Reloading.
Sep 30 07:02:17 compute-0 systemd-rc-local-generator[179951]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:02:17 compute-0 systemd-sysv-generator[179954]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:02:17 compute-0 sudo[179920]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:18 compute-0 sudo[180123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pxcjyyhynlgvrfjrloendqiwgpyungfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215737.8251538-2399-34600543918261/AnsiballZ_command.py'
Sep 30 07:02:18 compute-0 podman[180082]: 2025-09-30 07:02:18.268296793 +0000 UTC m=+0.098001047 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20250930, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 07:02:18 compute-0 sudo[180123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:18 compute-0 python3.9[180128]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:02:18 compute-0 sudo[180123]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:19 compute-0 sudo[180280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zwsmmibszhbmmmfsspgftfqrdrtofqbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215738.7023723-2399-217744587263404/AnsiballZ_command.py'
Sep 30 07:02:19 compute-0 sudo[180280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:19 compute-0 python3.9[180282]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:02:19 compute-0 sudo[180280]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:19 compute-0 sudo[180433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfdsvnpzvdgsteliefsdpkmmijxyeojh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215739.5968175-2399-58775156614801/AnsiballZ_command.py'
Sep 30 07:02:19 compute-0 sudo[180433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:20 compute-0 python3.9[180435]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:02:20 compute-0 sudo[180433]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:02:20.512 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:02:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:02:20.513 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:02:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:02:20.513 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:02:20 compute-0 sudo[180587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pucghxkiufpjyewwwfpxpunbgrgrniqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215740.3662083-2399-204495174989359/AnsiballZ_command.py'
Sep 30 07:02:20 compute-0 sudo[180587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:21 compute-0 python3.9[180589]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:02:21 compute-0 sudo[180587]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:21 compute-0 sudo[180740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sebtbffdppbowsoyuynifuiqftchrglz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215741.335213-2399-31960004436517/AnsiballZ_command.py'
Sep 30 07:02:21 compute-0 sudo[180740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:21 compute-0 python3.9[180742]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:02:21 compute-0 sudo[180740]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:22 compute-0 sudo[180893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tztfofyjdqxydacjmubefgwadqzehsdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215742.119309-2399-23706112347684/AnsiballZ_command.py'
Sep 30 07:02:22 compute-0 sudo[180893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:22 compute-0 python3.9[180895]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:02:22 compute-0 sudo[180893]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:23 compute-0 sudo[181058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aiitomokgpumaaomuwrwvzubcnwlbpgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215742.9245865-2399-21498098677767/AnsiballZ_command.py'
Sep 30 07:02:23 compute-0 sudo[181058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:23 compute-0 podman[181020]: 2025-09-30 07:02:23.418082802 +0000 UTC m=+0.159149393 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:02:23 compute-0 python3.9[181064]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:02:23 compute-0 sudo[181058]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:24 compute-0 sudo[181224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frhqszldezttcvjstivssnvozaphbhby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215743.763728-2399-129536053487937/AnsiballZ_command.py'
Sep 30 07:02:24 compute-0 sudo[181224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:24 compute-0 podman[181226]: 2025-09-30 07:02:24.173505713 +0000 UTC m=+0.078238494 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4)
Sep 30 07:02:24 compute-0 python3.9[181227]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:02:24 compute-0 sudo[181224]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:25 compute-0 sudo[181398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzxrbpktfgjuigmnuauxehknijwhkvbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215745.4561782-2542-138573294779665/AnsiballZ_file.py'
Sep 30 07:02:25 compute-0 sudo[181398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:26 compute-0 python3.9[181400]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:26 compute-0 sudo[181398]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:26 compute-0 podman[181524]: 2025-09-30 07:02:26.660650879 +0000 UTC m=+0.072153326 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 07:02:26 compute-0 sudo[181565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywzpvahounzjuwnudyvltagkjmvzwwph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215746.2541494-2542-44371715023491/AnsiballZ_file.py'
Sep 30 07:02:26 compute-0 sudo[181565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:26 compute-0 python3.9[181571]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:26 compute-0 sudo[181565]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:27 compute-0 sudo[181721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pljwivvrlbrqsaqwsxrfteroalgylgsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215747.133083-2542-179219669016669/AnsiballZ_file.py'
Sep 30 07:02:27 compute-0 sudo[181721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:27 compute-0 python3.9[181723]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:27 compute-0 sudo[181721]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:28 compute-0 sudo[181873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htdrixxbntbissxwabfybozovxmdrqaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215747.935741-2586-15051617749314/AnsiballZ_file.py'
Sep 30 07:02:28 compute-0 sudo[181873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:28 compute-0 python3.9[181875]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:28 compute-0 sudo[181873]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:29 compute-0 sudo[182025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-knrurfumfsdxueweqqkvoydatoyrjfws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215748.795385-2586-236261734985585/AnsiballZ_file.py'
Sep 30 07:02:29 compute-0 sudo[182025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:29 compute-0 python3.9[182027]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:29 compute-0 sudo[182025]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:30 compute-0 sudo[182177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avymyeflkzevxamekvwoisyzkextydwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215749.6745117-2586-58206209318869/AnsiballZ_file.py'
Sep 30 07:02:30 compute-0 sudo[182177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:30 compute-0 python3.9[182179]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:30 compute-0 sudo[182177]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:30 compute-0 sudo[182329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjpedzafwxxnjwhppogedvunygzbcnpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215750.5063002-2586-142777878660167/AnsiballZ_file.py'
Sep 30 07:02:30 compute-0 sudo[182329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:31 compute-0 python3.9[182331]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:31 compute-0 sudo[182329]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:31 compute-0 sudo[182481]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esniqdknkturswkukwnmarddaclvvseg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215751.3030543-2586-247215653502375/AnsiballZ_file.py'
Sep 30 07:02:31 compute-0 sudo[182481]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:31 compute-0 python3.9[182483]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:31 compute-0 sudo[182481]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:32 compute-0 sudo[182633]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arbswcwegdwoplbfatozxcvdwphqpozh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215752.0990925-2586-279972672760655/AnsiballZ_file.py'
Sep 30 07:02:32 compute-0 sudo[182633]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:32 compute-0 python3.9[182635]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:32 compute-0 sudo[182633]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:33 compute-0 sudo[182785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlhbuqozugbcfbltqlbofpqoowzuoiij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215752.9177349-2586-255795401851498/AnsiballZ_file.py'
Sep 30 07:02:33 compute-0 sudo[182785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:33 compute-0 python3.9[182787]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:33 compute-0 sudo[182785]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:34 compute-0 sudo[182937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rweyqnsscirwquxgnnzmkvtmhayadvut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215753.8213837-2586-186014658956344/AnsiballZ_file.py'
Sep 30 07:02:34 compute-0 sudo[182937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:34 compute-0 python3.9[182939]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:34 compute-0 sudo[182937]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:35 compute-0 sudo[183089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bffupbtprxbjqcpctceforklofdtlgco ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215754.738767-2586-155310974891047/AnsiballZ_file.py'
Sep 30 07:02:35 compute-0 sudo[183089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:35 compute-0 python3.9[183091]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:35 compute-0 sudo[183089]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:40 compute-0 sudo[183241]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbwbkhtzdxixxhmvckbthfgptyfmjtwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215759.9435287-2851-86494128803321/AnsiballZ_getent.py'
Sep 30 07:02:40 compute-0 sudo[183241]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:40 compute-0 python3.9[183243]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Sep 30 07:02:40 compute-0 sudo[183241]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:41 compute-0 sudo[183394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmwouirjpaavfgiplnkhtakvbyrawygs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215760.9238324-2867-9650056161311/AnsiballZ_group.py'
Sep 30 07:02:41 compute-0 sudo[183394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:41 compute-0 python3.9[183396]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 07:02:41 compute-0 groupadd[183397]: group added to /etc/group: name=nova, GID=42436
Sep 30 07:02:41 compute-0 groupadd[183397]: group added to /etc/gshadow: name=nova
Sep 30 07:02:41 compute-0 groupadd[183397]: new group: name=nova, GID=42436
Sep 30 07:02:41 compute-0 sudo[183394]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:42 compute-0 sudo[183552]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkhtaganjekxcarijnjmfbzxwdcvhjnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215762.1141481-2883-269001112940866/AnsiballZ_user.py'
Sep 30 07:02:42 compute-0 sudo[183552]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:42 compute-0 python3.9[183554]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 07:02:43 compute-0 useradd[183556]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Sep 30 07:02:43 compute-0 useradd[183556]: add 'nova' to group 'libvirt'
Sep 30 07:02:43 compute-0 useradd[183556]: add 'nova' to shadow group 'libvirt'
Sep 30 07:02:43 compute-0 sudo[183552]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:44 compute-0 sshd-session[183587]: Accepted publickey for zuul from 192.168.122.30 port 56762 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 07:02:44 compute-0 systemd-logind[824]: New session 27 of user zuul.
Sep 30 07:02:44 compute-0 systemd[1]: Started Session 27 of User zuul.
Sep 30 07:02:44 compute-0 sshd-session[183587]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 07:02:44 compute-0 sshd-session[183590]: Received disconnect from 192.168.122.30 port 56762:11: disconnected by user
Sep 30 07:02:44 compute-0 sshd-session[183590]: Disconnected from user zuul 192.168.122.30 port 56762
Sep 30 07:02:44 compute-0 sshd-session[183587]: pam_unix(sshd:session): session closed for user zuul
Sep 30 07:02:44 compute-0 systemd[1]: session-27.scope: Deactivated successfully.
Sep 30 07:02:44 compute-0 systemd-logind[824]: Session 27 logged out. Waiting for processes to exit.
Sep 30 07:02:44 compute-0 systemd-logind[824]: Removed session 27.
Sep 30 07:02:45 compute-0 python3.9[183740]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:02:45 compute-0 python3.9[183861]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215764.4336197-2933-32180950271512/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:46 compute-0 python3.9[184011]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:02:47 compute-0 python3.9[184087]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:47 compute-0 python3.9[184237]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:02:48 compute-0 podman[184332]: 2025-09-30 07:02:48.531863323 +0000 UTC m=+0.103407294 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 07:02:48 compute-0 python3.9[184371]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215767.332514-2933-107561485933057/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:49 compute-0 python3.9[184528]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:02:50 compute-0 python3.9[184649]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215768.9491723-2933-30994184937136/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=1feba546d0beacad9258164ab79b8a747685ccc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:51 compute-0 python3.9[184799]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:02:51 compute-0 python3.9[184920]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215770.43122-2933-76067703563077/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:52 compute-0 sudo[185070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqzbzexqnnsztxonnyvwpxtmxouypyqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215772.074589-3071-60690297953415/AnsiballZ_file.py'
Sep 30 07:02:52 compute-0 sudo[185070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:52 compute-0 python3.9[185072]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:52 compute-0 sudo[185070]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:53 compute-0 sudo[185222]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmouwatdakoacexbndhjziubxhuovyto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215772.9525843-3087-103718297090254/AnsiballZ_copy.py'
Sep 30 07:02:53 compute-0 sudo[185222]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:53 compute-0 python3.9[185224]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:02:53 compute-0 sudo[185222]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:53 compute-0 podman[185225]: 2025-09-30 07:02:53.719184423 +0000 UTC m=+0.115313490 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 07:02:54 compute-0 sudo[185413]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvrmfieqilknmqgaetafwbuwtnpwthpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215773.8271894-3103-4218053198629/AnsiballZ_stat.py'
Sep 30 07:02:54 compute-0 sudo[185413]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:54 compute-0 podman[185374]: 2025-09-30 07:02:54.339063887 +0000 UTC m=+0.107397240 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Sep 30 07:02:54 compute-0 python3.9[185420]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:02:54 compute-0 sudo[185413]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:55 compute-0 sudo[185573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ksmrrigvwljnsmogayzhxedpjqxutwvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215774.7828276-3119-241455301705710/AnsiballZ_stat.py'
Sep 30 07:02:55 compute-0 sudo[185573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:55 compute-0 python3.9[185575]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:02:55 compute-0 sudo[185573]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:56 compute-0 sudo[185696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djyryhgwukkbgwanzolkspercjwkcjul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215774.7828276-3119-241455301705710/AnsiballZ_copy.py'
Sep 30 07:02:56 compute-0 sudo[185696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:02:56 compute-0 python3.9[185698]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759215774.7828276-3119-241455301705710/.source _original_basename=.hl458u_n follow=False checksum=419c70c4a80d4d55aa2d99a4638bf49bf55a6bb6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Sep 30 07:02:56 compute-0 sudo[185696]: pam_unix(sudo:session): session closed for user root
Sep 30 07:02:57 compute-0 podman[185824]: 2025-09-30 07:02:57.191555203 +0000 UTC m=+0.134222589 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:02:57 compute-0 python3.9[185861]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:02:58 compute-0 python3.9[186022]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:02:58 compute-0 python3.9[186143]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215777.612334-3171-266627885985904/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=52e5c207b65a05937a65caa1823d79c347a7beb0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:02:59 compute-0 python3.9[186293]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:03:00 compute-0 python3.9[186414]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215779.059682-3201-25011558743800/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=3cf05d68d95be002f01ec016347c8ba2745fe64a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:03:01 compute-0 sudo[186564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bsgwggslsyldxjeszojweqnnjqeitpia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215780.8541346-3235-235644616247094/AnsiballZ_container_config_data.py'
Sep 30 07:03:01 compute-0 sudo[186564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:01 compute-0 python3.9[186566]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Sep 30 07:03:01 compute-0 sudo[186564]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:02 compute-0 sudo[186716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nhuficagnzxnveuaqauvsttfldsqpqsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215781.7120326-3253-211344508785008/AnsiballZ_container_config_hash.py'
Sep 30 07:03:02 compute-0 sudo[186716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:02 compute-0 python3.9[186718]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 07:03:02 compute-0 sudo[186716]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:03 compute-0 sudo[186868]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcblhncedzbkfgernqlvrjljtaxjukff ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759215782.7592502-3273-218704364551610/AnsiballZ_edpm_container_manage.py'
Sep 30 07:03:03 compute-0 sudo[186868]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:03 compute-0 python3[186870]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 07:03:03 compute-0 podman[186905]: 2025-09-30 07:03:03.72916519 +0000 UTC m=+0.083156756 container create cd6bef0fe6fff3da2868269e357027380332e3c870b91a5fb5d0a26e6066e8aa (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, tcib_managed=true, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=edpm, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, container_name=nova_compute_init, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Sep 30 07:03:03 compute-0 podman[186905]: 2025-09-30 07:03:03.685502002 +0000 UTC m=+0.039493628 image pull b4e0ba921b5ecb84b5b785b68bb6d15e43854720aa99c361795320d2a08a3eee 38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Sep 30 07:03:03 compute-0 python3[186870]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z 38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Sep 30 07:03:03 compute-0 sudo[186868]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:04 compute-0 sudo[187093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbygqekvifgfsbfqjomybquqdhrqmnbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215784.1727974-3289-16545893690579/AnsiballZ_stat.py'
Sep 30 07:03:04 compute-0 sudo[187093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:04 compute-0 python3.9[187095]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:03:04 compute-0 sudo[187093]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:05 compute-0 sudo[187247]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfmbpxyorlibrkvowqnlccvbndaqlkzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215785.4283366-3313-79841880423286/AnsiballZ_container_config_data.py'
Sep 30 07:03:05 compute-0 sudo[187247]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:06 compute-0 python3.9[187249]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Sep 30 07:03:06 compute-0 sudo[187247]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:06 compute-0 sudo[187399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvvnvoscknonpnoizajybfjwofkshlde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215786.3340385-3331-164576605692594/AnsiballZ_container_config_hash.py'
Sep 30 07:03:06 compute-0 sudo[187399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:06 compute-0 python3.9[187401]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 07:03:06 compute-0 sudo[187399]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:07 compute-0 sudo[187551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krarcccdwphsuohaguosodykyzpnkhqs ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759215787.3054268-3351-151940188360975/AnsiballZ_edpm_container_manage.py'
Sep 30 07:03:07 compute-0 sudo[187551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:08 compute-0 python3[187553]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 07:03:08 compute-0 podman[187591]: 2025-09-30 07:03:08.327009969 +0000 UTC m=+0.072864127 container create 297f5d0b7623d2903933ffebf629d799d1cd5ed769aa04d455ac0810bc9dd2f6 (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:03:08 compute-0 podman[187591]: 2025-09-30 07:03:08.299054307 +0000 UTC m=+0.044908495 image pull b4e0ba921b5ecb84b5b785b68bb6d15e43854720aa99c361795320d2a08a3eee 38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest
Sep 30 07:03:08 compute-0 python3[187553]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro 38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest kolla_start
Sep 30 07:03:08 compute-0 sudo[187551]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:09 compute-0 sudo[187779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kyouikcamttyfwyujdinjzaybxnugzil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215788.7208197-3367-278775042624478/AnsiballZ_stat.py'
Sep 30 07:03:09 compute-0 sudo[187779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:09 compute-0 python3.9[187781]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:03:09 compute-0 sudo[187779]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:10 compute-0 sudo[187933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dylysikjdxklvstbaqjpctqnmfwpkjim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215789.6729174-3385-26637930225440/AnsiballZ_file.py'
Sep 30 07:03:10 compute-0 sudo[187933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:10 compute-0 python3.9[187935]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:03:10 compute-0 sudo[187933]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:10 compute-0 sudo[188084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yttlkjyaorxzkcpfdtmfvdmgsxpvgylg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215790.3534212-3385-68695198958955/AnsiballZ_copy.py'
Sep 30 07:03:10 compute-0 sudo[188084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:11 compute-0 python3.9[188086]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759215790.3534212-3385-68695198958955/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:03:11 compute-0 sudo[188084]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:11 compute-0 sudo[188160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azlkngpiwgtzerjcwlbuomszvnafdxdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215790.3534212-3385-68695198958955/AnsiballZ_systemd.py'
Sep 30 07:03:11 compute-0 sudo[188160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:11 compute-0 python3.9[188162]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 07:03:11 compute-0 systemd[1]: Reloading.
Sep 30 07:03:11 compute-0 systemd-rc-local-generator[188190]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:03:12 compute-0 systemd-sysv-generator[188193]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:03:12 compute-0 sudo[188160]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:12 compute-0 sudo[188271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ralqhbfgocxawysdfvoaqnfotablbxir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215790.3534212-3385-68695198958955/AnsiballZ_systemd.py'
Sep 30 07:03:12 compute-0 sudo[188271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:12 compute-0 python3.9[188273]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:03:12 compute-0 systemd[1]: Reloading.
Sep 30 07:03:13 compute-0 systemd-rc-local-generator[188304]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:03:13 compute-0 systemd-sysv-generator[188308]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:03:13 compute-0 systemd[1]: Starting nova_compute container...
Sep 30 07:03:13 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:03:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35c4d18086d8d39aac3420067579acffd0eed0ec82f913673c6f5aef8fef34c9/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 07:03:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35c4d18086d8d39aac3420067579acffd0eed0ec82f913673c6f5aef8fef34c9/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Sep 30 07:03:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35c4d18086d8d39aac3420067579acffd0eed0ec82f913673c6f5aef8fef34c9/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Sep 30 07:03:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35c4d18086d8d39aac3420067579acffd0eed0ec82f913673c6f5aef8fef34c9/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 07:03:13 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35c4d18086d8d39aac3420067579acffd0eed0ec82f913673c6f5aef8fef34c9/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Sep 30 07:03:13 compute-0 podman[188314]: 2025-09-30 07:03:13.487427937 +0000 UTC m=+0.133139068 container init 297f5d0b7623d2903933ffebf629d799d1cd5ed769aa04d455ac0810bc9dd2f6 (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=edpm, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:03:13 compute-0 podman[188314]: 2025-09-30 07:03:13.49615728 +0000 UTC m=+0.141868351 container start 297f5d0b7623d2903933ffebf629d799d1cd5ed769aa04d455ac0810bc9dd2f6 (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, container_name=nova_compute, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_managed=true)
Sep 30 07:03:13 compute-0 podman[188314]: nova_compute
Sep 30 07:03:13 compute-0 nova_compute[188329]: + sudo -E kolla_set_configs
Sep 30 07:03:13 compute-0 systemd[1]: Started nova_compute container.
Sep 30 07:03:13 compute-0 sudo[188271]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Validating config file
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Copying service configuration files
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Deleting /etc/nova/nova.conf
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Deleting /etc/ceph
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Creating directory /etc/ceph
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Setting permission for /etc/ceph
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Writing out command to execute
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 07:03:13 compute-0 nova_compute[188329]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 07:03:13 compute-0 nova_compute[188329]: ++ cat /run_command
Sep 30 07:03:13 compute-0 nova_compute[188329]: + CMD=nova-compute
Sep 30 07:03:13 compute-0 nova_compute[188329]: + ARGS=
Sep 30 07:03:13 compute-0 nova_compute[188329]: + sudo kolla_copy_cacerts
Sep 30 07:03:13 compute-0 nova_compute[188329]: + [[ ! -n '' ]]
Sep 30 07:03:13 compute-0 nova_compute[188329]: + . kolla_extend_start
Sep 30 07:03:13 compute-0 nova_compute[188329]: Running command: 'nova-compute'
Sep 30 07:03:13 compute-0 nova_compute[188329]: + echo 'Running command: '\''nova-compute'\'''
Sep 30 07:03:13 compute-0 nova_compute[188329]: + umask 0022
Sep 30 07:03:13 compute-0 nova_compute[188329]: + exec nova-compute
Sep 30 07:03:14 compute-0 python3.9[188490]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:03:15 compute-0 python3.9[188640]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:03:15 compute-0 nova_compute[188329]: 2025-09-30 07:03:15.729 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 07:03:15 compute-0 nova_compute[188329]: 2025-09-30 07:03:15.730 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 07:03:15 compute-0 nova_compute[188329]: 2025-09-30 07:03:15.730 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 07:03:15 compute-0 nova_compute[188329]: 2025-09-30 07:03:15.730 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Sep 30 07:03:15 compute-0 nova_compute[188329]: 2025-09-30 07:03:15.894 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:03:15 compute-0 nova_compute[188329]: 2025-09-30 07:03:15.924 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:03:15 compute-0 nova_compute[188329]: 2025-09-30 07:03:15.962 2 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Sep 30 07:03:15 compute-0 nova_compute[188329]: 2025-09-30 07:03:15.964 2 WARNING oslo_config.cfg [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Sep 30 07:03:16 compute-0 python3.9[188793]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:03:16 compute-0 nova_compute[188329]: 2025-09-30 07:03:16.945 2 INFO nova.virt.driver [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.074 2 INFO nova.compute.provider_config [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Sep 30 07:03:17 compute-0 sudo[188945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oibmvrryqrwjhgaklysrlekqdxgzllyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215796.958666-3505-101075013100387/AnsiballZ_podman_container.py'
Sep 30 07:03:17 compute-0 sudo[188945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.581 2 DEBUG oslo_concurrency.lockutils [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.583 2 DEBUG oslo_concurrency.lockutils [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.583 2 DEBUG oslo_concurrency.lockutils [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.584 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.584 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.584 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.584 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.585 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.585 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.586 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.586 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.586 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.586 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.587 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.587 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.587 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.587 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.588 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.588 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.588 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.589 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.589 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.589 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.589 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.590 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.590 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.590 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.591 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.591 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.591 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.591 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.592 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.592 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.592 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.592 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.593 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.593 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.593 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.594 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.594 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.594 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.594 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.595 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.595 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.595 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.596 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.596 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.596 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.597 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.597 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.597 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.597 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.598 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.598 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.598 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.598 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.599 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.599 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.599 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.599 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.600 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.600 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.600 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.601 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.601 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.601 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.601 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.602 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.602 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.602 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.602 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.603 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.603 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.603 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.603 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.604 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.604 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.604 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.604 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.605 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.605 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.605 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.605 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.606 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.606 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.606 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.606 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.607 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.607 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.607 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.608 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.608 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.608 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.608 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.609 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.609 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.609 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.610 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.610 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.610 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.610 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.611 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.611 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.611 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.611 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.612 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.612 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.612 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.612 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.613 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.613 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.613 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.613 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.614 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.614 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.614 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.614 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.615 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.615 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.615 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.615 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.616 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.616 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.616 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.616 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.617 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.617 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.617 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.617 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.618 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.618 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.618 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.618 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.619 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.619 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.619 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.620 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.620 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.620 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.620 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.621 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.621 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.621 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.622 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.622 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.622 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.622 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.623 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.623 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.623 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.623 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.624 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.624 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.624 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.625 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.625 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.625 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.625 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.626 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.626 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.626 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.626 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.627 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.627 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.627 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.628 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.628 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.628 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.628 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.629 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.629 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.629 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.629 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.630 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.630 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.630 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.630 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.631 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.631 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.631 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.631 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.631 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.632 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.632 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.632 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.632 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.632 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.632 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.633 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.633 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.633 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.633 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.633 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.633 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.634 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.634 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.634 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.634 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.634 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.634 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.635 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.635 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.635 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.635 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.635 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.635 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.636 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.636 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.636 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.636 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.636 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.636 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.637 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.637 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.637 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.637 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.637 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.637 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.638 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.638 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.638 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.638 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.638 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.638 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.639 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.639 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.639 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.639 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.639 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.640 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.640 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.640 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.640 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.640 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.640 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.641 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.641 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.641 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.641 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.641 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.641 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.642 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.642 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.642 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.642 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.642 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.643 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.643 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.643 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.643 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.643 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.643 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.644 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.644 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.644 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.644 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.644 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.644 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.645 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.645 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.645 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.645 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.645 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.645 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.646 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.646 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.646 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.646 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.646 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.646 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.647 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.647 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.647 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.647 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.647 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.647 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.648 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.648 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.648 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.648 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.648 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.649 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.649 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.649 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.649 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.649 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.649 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.650 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 python3.9[188947]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.650 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.650 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.650 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.650 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.650 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.651 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.651 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.651 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.651 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.651 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.651 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.652 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.652 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.652 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.652 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.652 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.652 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.653 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.653 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.653 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.653 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.653 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.653 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.654 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.654 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.654 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.654 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.654 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.654 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.655 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.655 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.655 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.655 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.655 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.655 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.656 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.656 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.656 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.656 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.656 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.656 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.657 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.657 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.657 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.657 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.657 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.659 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.660 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.660 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.660 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.660 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.660 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.660 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.661 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.661 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.661 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.661 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.661 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.661 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.662 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.662 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.662 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.662 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.662 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.662 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.663 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.663 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.663 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.663 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.663 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.663 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.664 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.664 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.664 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.664 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.664 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.664 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.665 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.665 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.665 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.665 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.665 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.665 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.666 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.666 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.666 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.666 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.666 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.666 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.667 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.667 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.667 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.667 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.668 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.668 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.668 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.668 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.668 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.668 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.669 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.669 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.669 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.669 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.669 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.669 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.670 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.670 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.670 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.670 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.670 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.670 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.671 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.671 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.671 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.671 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.671 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.671 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.672 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.672 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.672 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.672 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.672 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.672 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.672 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.672 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.673 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.673 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.673 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.673 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.673 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.673 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.673 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.673 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.674 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.674 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.674 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.674 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.674 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.674 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.674 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.674 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.674 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.675 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.675 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.675 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.675 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.675 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.675 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.675 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.675 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.675 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.676 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.676 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.676 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.676 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.676 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.676 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.676 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.676 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.676 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.677 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.677 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.677 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.677 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.677 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.677 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.677 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.677 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.678 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.678 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.678 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.678 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.678 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.678 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.678 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.678 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.678 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.679 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.679 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.679 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.679 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.679 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.679 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.679 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.679 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.679 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.680 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.680 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.680 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.680 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.680 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.680 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.680 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.680 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.681 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.681 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.681 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.681 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.681 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.681 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.681 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.681 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.681 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.682 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.682 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.682 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.682 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.682 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.682 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.682 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.682 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.683 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.683 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.683 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.683 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.683 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.683 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.683 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.683 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.684 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.684 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.684 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.684 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.684 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.684 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.684 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.684 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.685 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.685 2 WARNING oslo_config.cfg [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Sep 30 07:03:17 compute-0 nova_compute[188329]: live_migration_uri is deprecated for removal in favor of two other options that
Sep 30 07:03:17 compute-0 nova_compute[188329]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Sep 30 07:03:17 compute-0 nova_compute[188329]: and ``live_migration_inbound_addr`` respectively.
Sep 30 07:03:17 compute-0 nova_compute[188329]: ).  Its value may be silently ignored in the future.
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.685 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.685 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.685 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.685 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.685 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.686 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.686 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.686 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.686 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.686 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.686 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.686 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.687 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.687 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.687 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.687 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.687 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.687 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.688 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.688 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.688 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.688 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.688 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.688 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.688 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.688 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.688 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.689 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.689 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.689 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.689 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.689 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.689 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.689 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.689 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.690 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.690 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.690 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.690 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.690 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.690 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.690 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.690 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.691 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.691 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.691 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.691 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.691 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.691 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.691 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.691 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.692 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.692 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.692 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.692 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.692 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.692 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.692 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.692 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.692 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.693 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.693 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.693 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.693 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.693 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.693 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.693 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.694 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.694 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.694 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.694 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.694 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.694 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.694 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.694 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.695 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.695 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.695 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.695 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.695 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.695 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.695 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.696 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.696 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.696 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.696 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.696 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.696 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.697 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.697 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.697 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.697 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.697 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.698 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.698 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.698 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.698 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.698 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.698 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.699 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.699 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.699 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.699 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.699 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.699 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.700 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.700 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.700 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.700 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.700 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.700 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.701 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.701 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.701 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.701 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.701 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.701 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.701 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.701 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.702 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.702 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.702 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.702 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.702 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.702 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.702 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.702 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.702 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.703 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.703 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.703 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.703 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.703 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.703 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.703 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.703 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.704 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.704 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.704 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.704 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.704 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.704 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.704 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.704 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.705 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.705 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.705 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.705 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.705 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.705 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.705 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.706 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.706 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.706 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.706 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.706 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.706 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.706 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.707 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.707 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.707 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.707 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.707 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.707 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.707 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.707 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.708 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.708 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.708 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.708 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.708 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.708 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.708 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.708 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.708 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.709 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.709 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.709 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.709 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.709 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.709 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.709 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.709 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.709 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.710 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.710 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.710 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.710 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.710 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.710 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.710 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.711 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.711 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.711 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.711 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.711 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.711 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.711 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.711 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.712 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.712 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.712 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.712 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.712 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.712 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.712 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.712 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.713 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.713 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.713 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.713 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.713 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.713 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.713 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.713 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.714 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.714 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.714 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.714 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.714 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.714 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.714 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.714 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.715 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.715 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.715 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.715 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.715 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.715 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.715 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.715 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.716 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.716 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.716 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.716 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.716 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.716 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.716 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.716 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.716 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.717 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.717 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.717 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.717 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.717 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.717 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.717 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.718 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.718 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.718 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.718 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.718 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.718 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.718 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.718 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.719 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.719 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.719 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.719 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.719 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.719 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.719 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.720 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.720 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.720 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.720 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.720 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.720 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.720 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.720 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.721 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.721 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.721 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.721 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.721 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.721 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.721 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.721 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.721 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.722 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.722 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.722 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.722 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.722 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.722 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.722 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.722 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.722 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.723 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.723 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.723 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.723 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.723 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.723 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.723 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.723 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.724 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.724 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.724 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.724 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.724 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.724 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.724 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.724 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.725 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.725 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.725 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.725 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.725 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.725 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.725 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.725 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.725 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.726 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.726 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.726 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.726 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.726 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.726 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.726 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.726 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.726 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.727 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.727 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.727 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.727 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.727 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.727 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.727 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.727 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.727 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.728 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.728 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.728 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.728 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.728 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.728 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.728 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.728 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.729 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.729 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.729 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.729 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.729 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.729 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.729 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.729 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.730 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.730 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.730 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.730 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.730 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.730 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.730 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.730 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.730 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.731 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.731 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.731 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.731 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.731 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.731 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.731 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.731 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.732 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.732 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.732 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.732 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.732 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.732 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.732 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.732 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.732 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.733 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.733 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.733 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.733 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.733 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.733 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.733 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.733 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.733 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.734 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.734 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.734 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.734 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.734 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.734 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.734 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.735 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.735 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.735 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.735 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.735 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.735 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.736 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.736 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.736 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.736 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.736 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.736 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.736 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.737 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.737 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.737 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.737 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.737 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.737 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.737 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.738 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.738 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.738 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.738 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.738 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.738 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.739 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.739 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.739 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.739 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.739 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.739 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.739 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.739 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.740 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.740 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.740 2 DEBUG oslo_service.backend._eventlet.service [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Sep 30 07:03:17 compute-0 nova_compute[188329]: 2025-09-30 07:03:17.741 2 INFO nova.service [-] Starting compute node (version 32.1.0-0.20250919142712.b99a882.el10)
Sep 30 07:03:17 compute-0 sudo[188945]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:18 compute-0 nova_compute[188329]: 2025-09-30 07:03:18.249 2 DEBUG nova.virt.libvirt.host [None req-4e47cec2-a39e-47ca-9f35-a0737dae72ec - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Sep 30 07:03:18 compute-0 systemd[1]: Starting libvirt QEMU daemon...
Sep 30 07:03:18 compute-0 systemd[1]: Started libvirt QEMU daemon.
Sep 30 07:03:18 compute-0 nova_compute[188329]: 2025-09-30 07:03:18.335 2 DEBUG nova.virt.libvirt.host [None req-4e47cec2-a39e-47ca-9f35-a0737dae72ec - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f0d47aeb080> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Sep 30 07:03:18 compute-0 nova_compute[188329]: libvirt:  error : internal error: could not initialize domain event timer
Sep 30 07:03:18 compute-0 nova_compute[188329]: 2025-09-30 07:03:18.337 2 WARNING nova.virt.libvirt.host [None req-4e47cec2-a39e-47ca-9f35-a0737dae72ec - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Sep 30 07:03:18 compute-0 nova_compute[188329]: 2025-09-30 07:03:18.338 2 DEBUG nova.virt.libvirt.host [None req-4e47cec2-a39e-47ca-9f35-a0737dae72ec - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f0d47aeb080> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Sep 30 07:03:18 compute-0 nova_compute[188329]: 2025-09-30 07:03:18.341 2 DEBUG nova.virt.libvirt.host [None req-4e47cec2-a39e-47ca-9f35-a0737dae72ec - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Sep 30 07:03:18 compute-0 nova_compute[188329]: 2025-09-30 07:03:18.342 2 DEBUG nova.virt.libvirt.host [None req-4e47cec2-a39e-47ca-9f35-a0737dae72ec - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Sep 30 07:03:18 compute-0 nova_compute[188329]: 2025-09-30 07:03:18.342 2 INFO nova.utils [None req-4e47cec2-a39e-47ca-9f35-a0737dae72ec - - - - - -] The default thread pool MainProcess.default is initialized
Sep 30 07:03:18 compute-0 nova_compute[188329]: 2025-09-30 07:03:18.343 2 DEBUG nova.virt.libvirt.host [None req-4e47cec2-a39e-47ca-9f35-a0737dae72ec - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Sep 30 07:03:18 compute-0 nova_compute[188329]: 2025-09-30 07:03:18.344 2 INFO nova.virt.libvirt.driver [None req-4e47cec2-a39e-47ca-9f35-a0737dae72ec - - - - - -] Connection event '1' reason 'None'
Sep 30 07:03:18 compute-0 sudo[189171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzhpvqwfauswnjnjkyegsqfisantyxml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215798.087133-3521-81711182239918/AnsiballZ_systemd.py'
Sep 30 07:03:18 compute-0 sudo[189171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:18 compute-0 python3.9[189173]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 07:03:18 compute-0 nova_compute[188329]: 2025-09-30 07:03:18.853 2 WARNING nova.virt.libvirt.driver [None req-4e47cec2-a39e-47ca-9f35-a0737dae72ec - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Sep 30 07:03:18 compute-0 nova_compute[188329]: 2025-09-30 07:03:18.854 2 DEBUG nova.virt.libvirt.volume.mount [None req-4e47cec2-a39e-47ca-9f35-a0737dae72ec - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Sep 30 07:03:18 compute-0 systemd[1]: Stopping nova_compute container...
Sep 30 07:03:18 compute-0 podman[189175]: 2025-09-30 07:03:18.908539565 +0000 UTC m=+0.102657652 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:03:18 compute-0 nova_compute[188329]: 2025-09-30 07:03:18.971 2 DEBUG oslo_concurrency.lockutils [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:03:18 compute-0 nova_compute[188329]: 2025-09-30 07:03:18.972 2 DEBUG oslo_concurrency.lockutils [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:03:18 compute-0 nova_compute[188329]: 2025-09-30 07:03:18.972 2 DEBUG oslo_concurrency.lockutils [None req-840467e3-1402-4296-9cda-d82abee8dd14 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:03:20 compute-0 virtqemud[189090]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Sep 30 07:03:20 compute-0 virtqemud[189090]: hostname: compute-0
Sep 30 07:03:20 compute-0 virtqemud[189090]: End of file while reading data: Input/output error
Sep 30 07:03:20 compute-0 systemd[1]: libpod-297f5d0b7623d2903933ffebf629d799d1cd5ed769aa04d455ac0810bc9dd2f6.scope: Deactivated successfully.
Sep 30 07:03:20 compute-0 systemd[1]: libpod-297f5d0b7623d2903933ffebf629d799d1cd5ed769aa04d455ac0810bc9dd2f6.scope: Consumed 3.552s CPU time.
Sep 30 07:03:20 compute-0 podman[189195]: 2025-09-30 07:03:20.02254803 +0000 UTC m=+1.111797022 container died 297f5d0b7623d2903933ffebf629d799d1cd5ed769aa04d455ac0810bc9dd2f6 (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible)
Sep 30 07:03:20 compute-0 systemd[1]: var-lib-containers-storage-overlay-35c4d18086d8d39aac3420067579acffd0eed0ec82f913673c6f5aef8fef34c9-merged.mount: Deactivated successfully.
Sep 30 07:03:20 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-297f5d0b7623d2903933ffebf629d799d1cd5ed769aa04d455ac0810bc9dd2f6-userdata-shm.mount: Deactivated successfully.
Sep 30 07:03:20 compute-0 podman[189195]: 2025-09-30 07:03:20.076833647 +0000 UTC m=+1.166082609 container cleanup 297f5d0b7623d2903933ffebf629d799d1cd5ed769aa04d455ac0810bc9dd2f6 (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Sep 30 07:03:20 compute-0 podman[189195]: nova_compute
Sep 30 07:03:20 compute-0 podman[189237]: nova_compute
Sep 30 07:03:20 compute-0 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Sep 30 07:03:20 compute-0 systemd[1]: Stopped nova_compute container.
Sep 30 07:03:20 compute-0 systemd[1]: Starting nova_compute container...
Sep 30 07:03:20 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:03:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35c4d18086d8d39aac3420067579acffd0eed0ec82f913673c6f5aef8fef34c9/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Sep 30 07:03:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35c4d18086d8d39aac3420067579acffd0eed0ec82f913673c6f5aef8fef34c9/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Sep 30 07:03:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35c4d18086d8d39aac3420067579acffd0eed0ec82f913673c6f5aef8fef34c9/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Sep 30 07:03:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35c4d18086d8d39aac3420067579acffd0eed0ec82f913673c6f5aef8fef34c9/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Sep 30 07:03:20 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/35c4d18086d8d39aac3420067579acffd0eed0ec82f913673c6f5aef8fef34c9/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Sep 30 07:03:20 compute-0 podman[189250]: 2025-09-30 07:03:20.313631145 +0000 UTC m=+0.123323663 container init 297f5d0b7623d2903933ffebf629d799d1cd5ed769aa04d455ac0810bc9dd2f6 (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=nova_compute, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20250930, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Sep 30 07:03:20 compute-0 podman[189250]: 2025-09-30 07:03:20.323707137 +0000 UTC m=+0.133399595 container start 297f5d0b7623d2903933ffebf629d799d1cd5ed769aa04d455ac0810bc9dd2f6 (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=edpm, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 07:03:20 compute-0 podman[189250]: nova_compute
Sep 30 07:03:20 compute-0 nova_compute[189265]: + sudo -E kolla_set_configs
Sep 30 07:03:20 compute-0 systemd[1]: Started nova_compute container.
Sep 30 07:03:20 compute-0 sudo[189171]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Validating config file
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Copying service configuration files
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Deleting /etc/nova/nova.conf
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Deleting /etc/ceph
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Creating directory /etc/ceph
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Setting permission for /etc/ceph
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Writing out command to execute
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Sep 30 07:03:20 compute-0 nova_compute[189265]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Sep 30 07:03:20 compute-0 nova_compute[189265]: ++ cat /run_command
Sep 30 07:03:20 compute-0 nova_compute[189265]: + CMD=nova-compute
Sep 30 07:03:20 compute-0 nova_compute[189265]: + ARGS=
Sep 30 07:03:20 compute-0 nova_compute[189265]: + sudo kolla_copy_cacerts
Sep 30 07:03:20 compute-0 nova_compute[189265]: + [[ ! -n '' ]]
Sep 30 07:03:20 compute-0 nova_compute[189265]: + . kolla_extend_start
Sep 30 07:03:20 compute-0 nova_compute[189265]: Running command: 'nova-compute'
Sep 30 07:03:20 compute-0 nova_compute[189265]: + echo 'Running command: '\''nova-compute'\'''
Sep 30 07:03:20 compute-0 nova_compute[189265]: + umask 0022
Sep 30 07:03:20 compute-0 nova_compute[189265]: + exec nova-compute
Sep 30 07:03:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:03:20.514 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:03:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:03:20.515 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:03:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:03:20.515 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:03:21 compute-0 sudo[189427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnsyqdrjvdqssuepjvzmreemctidqsnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215800.6915486-3539-158152255399016/AnsiballZ_podman_container.py'
Sep 30 07:03:21 compute-0 sudo[189427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:21 compute-0 python3.9[189429]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Sep 30 07:03:21 compute-0 systemd[1]: Started libpod-conmon-cd6bef0fe6fff3da2868269e357027380332e3c870b91a5fb5d0a26e6066e8aa.scope.
Sep 30 07:03:21 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:03:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83d124aa4cca547088314dd8703488eb97154002037546fd05dac0744720b7cc/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Sep 30 07:03:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83d124aa4cca547088314dd8703488eb97154002037546fd05dac0744720b7cc/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Sep 30 07:03:21 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83d124aa4cca547088314dd8703488eb97154002037546fd05dac0744720b7cc/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Sep 30 07:03:21 compute-0 podman[189454]: 2025-09-30 07:03:21.564073369 +0000 UTC m=+0.139147788 container init cd6bef0fe6fff3da2868269e357027380332e3c870b91a5fb5d0a26e6066e8aa (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=nova_compute_init, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:03:21 compute-0 podman[189454]: 2025-09-30 07:03:21.577710854 +0000 UTC m=+0.152785253 container start cd6bef0fe6fff3da2868269e357027380332e3c870b91a5fb5d0a26e6066e8aa (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=nova_compute_init, io.buildah.version=1.41.4)
Sep 30 07:03:21 compute-0 python3.9[189429]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Sep 30 07:03:21 compute-0 nova_compute_init[189476]: INFO:nova_statedir:Applying nova statedir ownership
Sep 30 07:03:21 compute-0 nova_compute_init[189476]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Sep 30 07:03:21 compute-0 nova_compute_init[189476]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Sep 30 07:03:21 compute-0 nova_compute_init[189476]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Sep 30 07:03:21 compute-0 nova_compute_init[189476]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Sep 30 07:03:21 compute-0 nova_compute_init[189476]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Sep 30 07:03:21 compute-0 nova_compute_init[189476]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Sep 30 07:03:21 compute-0 nova_compute_init[189476]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Sep 30 07:03:21 compute-0 nova_compute_init[189476]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Sep 30 07:03:21 compute-0 nova_compute_init[189476]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Sep 30 07:03:21 compute-0 nova_compute_init[189476]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Sep 30 07:03:21 compute-0 nova_compute_init[189476]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Sep 30 07:03:21 compute-0 nova_compute_init[189476]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Sep 30 07:03:21 compute-0 nova_compute_init[189476]: INFO:nova_statedir:Nova statedir ownership complete
Sep 30 07:03:21 compute-0 systemd[1]: libpod-cd6bef0fe6fff3da2868269e357027380332e3c870b91a5fb5d0a26e6066e8aa.scope: Deactivated successfully.
Sep 30 07:03:21 compute-0 podman[189490]: 2025-09-30 07:03:21.697546528 +0000 UTC m=+0.040029272 container died cd6bef0fe6fff3da2868269e357027380332e3c870b91a5fb5d0a26e6066e8aa (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20250930)
Sep 30 07:03:21 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cd6bef0fe6fff3da2868269e357027380332e3c870b91a5fb5d0a26e6066e8aa-userdata-shm.mount: Deactivated successfully.
Sep 30 07:03:21 compute-0 systemd[1]: var-lib-containers-storage-overlay-83d124aa4cca547088314dd8703488eb97154002037546fd05dac0744720b7cc-merged.mount: Deactivated successfully.
Sep 30 07:03:21 compute-0 podman[189490]: 2025-09-30 07:03:21.730985132 +0000 UTC m=+0.073467856 container cleanup cd6bef0fe6fff3da2868269e357027380332e3c870b91a5fb5d0a26e6066e8aa (image=38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest, name=nova_compute_init, config_id=edpm, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'image': '38.102.83.30:5001/podified-master-centos10/openstack-nova-compute:watcher_latest', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 07:03:21 compute-0 systemd[1]: libpod-conmon-cd6bef0fe6fff3da2868269e357027380332e3c870b91a5fb5d0a26e6066e8aa.scope: Deactivated successfully.
Sep 30 07:03:21 compute-0 sudo[189427]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:22 compute-0 sshd-session[154773]: Connection closed by 192.168.122.30 port 53024
Sep 30 07:03:22 compute-0 sshd-session[154770]: pam_unix(sshd:session): session closed for user zuul
Sep 30 07:03:22 compute-0 systemd[1]: session-25.scope: Deactivated successfully.
Sep 30 07:03:22 compute-0 systemd[1]: session-25.scope: Consumed 2min 53.592s CPU time.
Sep 30 07:03:22 compute-0 systemd-logind[824]: Session 25 logged out. Waiting for processes to exit.
Sep 30 07:03:22 compute-0 systemd-logind[824]: Removed session 25.
Sep 30 07:03:22 compute-0 nova_compute[189265]: 2025-09-30 07:03:22.559 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 07:03:22 compute-0 nova_compute[189265]: 2025-09-30 07:03:22.559 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 07:03:22 compute-0 nova_compute[189265]: 2025-09-30 07:03:22.560 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.12/site-packages/os_vif/__init__.py:44
Sep 30 07:03:22 compute-0 nova_compute[189265]: 2025-09-30 07:03:22.560 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Sep 30 07:03:22 compute-0 nova_compute[189265]: 2025-09-30 07:03:22.736 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:03:22 compute-0 nova_compute[189265]: 2025-09-30 07:03:22.752 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:03:22 compute-0 nova_compute[189265]: 2025-09-30 07:03:22.787 2 INFO oslo_service.periodic_task [-] Skipping periodic task _heal_instance_info_cache because its interval is negative
Sep 30 07:03:22 compute-0 nova_compute[189265]: 2025-09-30 07:03:22.788 2 WARNING oslo_config.cfg [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] Deprecated: Option "heartbeat_in_pthread" from group "oslo_messaging_rabbit" is deprecated for removal (The option is related to Eventlet which will be removed. In addition this has never worked as expected with services using eventlet for core service framework.).  Its value may be silently ignored in the future.
Sep 30 07:03:23 compute-0 nova_compute[189265]: 2025-09-30 07:03:23.760 2 INFO nova.virt.driver [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Sep 30 07:03:23 compute-0 nova_compute[189265]: 2025-09-30 07:03:23.852 2 INFO nova.compute.provider_config [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.361 2 DEBUG oslo_concurrency.lockutils [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.361 2 DEBUG oslo_concurrency.lockutils [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.361 2 DEBUG oslo_concurrency.lockutils [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.362 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/service.py:274
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.362 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2804
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.362 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2805
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.362 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2806
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.363 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2807
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.363 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2809
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.363 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.363 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.363 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.363 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.364 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.364 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.364 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cell_worker_thread_pool_size   = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.364 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.364 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.365 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.365 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.365 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.365 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.365 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.365 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] console_host                   = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.366 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.366 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.366 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.366 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.366 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.367 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.367 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.367 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] default_green_pool_size        = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.367 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.367 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.367 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] default_thread_pool_size       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.368 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.368 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.368 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] fatal_deprecations             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.368 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.368 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.368 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.369 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.369 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] heal_instance_info_cache_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.369 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] host                           = compute-0.ctlplane.example.com log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.369 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.369 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.370 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.370 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] injected_network_template      = /usr/lib/python3.12/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.370 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.370 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.370 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.371 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.371 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.371 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.371 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.371 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.371 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.372 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] key                            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.372 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.372 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] log_color                      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.372 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.372 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.373 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.373 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.373 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.373 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.373 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.373 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.374 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.374 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.374 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.374 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.374 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.374 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.375 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.375 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.375 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.375 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.375 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.375 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.376 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.376 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.376 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.376 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] my_block_storage_ip            = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.376 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] my_ip                          = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.377 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] my_shared_fs_storage_ip        = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.377 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.377 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.377 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.378 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.378 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.378 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.378 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.378 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.378 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.379 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] pybasedir                      = /usr/lib/python3.12/site-packages log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.379 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.379 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.379 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.379 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.380 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.380 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.380 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] record                         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.380 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.380 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.381 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.381 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.381 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.381 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.381 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.382 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.382 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.382 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.382 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.382 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.382 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.383 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.383 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.383 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.383 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.383 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.383 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.384 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] shell_completion               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.384 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.384 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.384 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.384 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.385 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.385 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.385 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.385 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.385 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.385 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.385 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] thread_pool_statistic_period   = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.386 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.386 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.386 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.386 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.386 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.387 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.387 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.387 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.387 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.387 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.387 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.389 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.389 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.390 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.390 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.390 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2817
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.391 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.391 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.391 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_brick.lock_path             = /var/lib/nova/tmp log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.392 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.392 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.392 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.392 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.393 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.393 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.393 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.394 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.394 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.394 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.394 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.395 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.395 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.395 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.395 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.396 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.396 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.396 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.397 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.397 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.neutron_default_project_id = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.397 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.response_validation        = warn log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.397 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.398 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.398 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.398 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.398 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.399 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.399 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.399 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.399 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.400 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.400 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.backend_expiration_time  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.400 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.401 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.401 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.401 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.401 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.402 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.402 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.enforce_fips_mode        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.402 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.402 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.403 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.403 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.403 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.memcache_password        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.403 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.404 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.404 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.404 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.404 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.405 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.405 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.405 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.memcache_username        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.406 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.406 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.redis_db                 = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.406 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.redis_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.406 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.redis_sentinel_service_name = mymaster log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.407 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.redis_sentinels          = ['localhost:26379'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.407 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.redis_server             = localhost:6379 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.407 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.redis_socket_timeout     = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.408 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.redis_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.408 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.408 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.408 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.409 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.409 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.409 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.409 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.410 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.410 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.410 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.410 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.411 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.411 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.411 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.412 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.412 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.412 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.412 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.413 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.413 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.413 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.414 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.414 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.414 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.414 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.415 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.415 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.415 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.415 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.416 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.416 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.416 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.416 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.417 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.417 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.417 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] compute.sharing_providers_max_uuids_per_request = 200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.418 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.418 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.418 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.418 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.419 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.419 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.419 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] consoleauth.enforce_session_timeout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.420 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.420 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.420 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.420 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.421 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.421 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.421 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.421 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.422 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.422 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.422 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.422 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.423 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.423 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.423 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.424 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.424 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.424 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.424 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.425 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.425 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.426 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.asyncio_connection    = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.426 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.427 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.427 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.428 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.428 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.428 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.429 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.429 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.429 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.430 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.430 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.430 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.430 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.431 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.431 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.431 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.432 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.432 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.432 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.432 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.433 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.asyncio_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.433 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.asyncio_slave_connection = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.433 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.434 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.434 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.434 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.434 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.435 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.435 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.435 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.435 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.436 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.436 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.436 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.436 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.437 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.437 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.437 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.437 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.438 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.438 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.438 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.439 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.439 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ephemeral_storage_encryption.default_format = luks log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.439 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.439 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.440 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.440 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.440 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.440 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.441 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.441 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.441 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.441 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.441 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.442 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.444 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.444 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.444 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.444 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.445 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.445 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.445 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.445 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.446 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.446 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.446 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.446 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.446 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.446 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.447 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.447 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.447 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.447 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.448 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.448 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.448 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.448 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.448 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.449 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.449 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.449 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.449 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.449 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.450 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.450 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.450 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.450 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.450 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.450 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.451 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.451 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.451 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.451 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.service_type            = shared-file-system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.452 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.share_apply_policy_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.452 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.452 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.452 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.452 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.452 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.453 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] manila.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.453 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.453 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.453 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.453 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.454 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.454 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.454 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.454 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.454 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.455 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.455 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.455 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.455 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.455 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.455 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.456 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.conductor_group         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.456 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.456 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.456 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.456 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.456 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.457 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.457 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.457 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.457 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.457 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.retriable_status_codes  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.457 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.458 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.458 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.458 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.shard                   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.458 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.458 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.458 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.459 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.459 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.459 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.459 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.459 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.459 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.460 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.460 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.460 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.460 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.460 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.461 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.461 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.461 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.461 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.461 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.461 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.461 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.462 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.462 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.462 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.462 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.462 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.462 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.463 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.463 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.463 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.463 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.463 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.464 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.464 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.464 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vault.approle_role_id          = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.464 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vault.approle_secret_id        = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.464 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.465 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vault.kv_path                  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.465 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.465 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.465 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vault.root_token_id            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.465 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.465 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vault.timeout                  = 60.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.465 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.466 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.466 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.466 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.466 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.466 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.467 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.467 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.467 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.467 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.467 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.467 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.468 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.468 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.468 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.468 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.468 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.469 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.469 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.469 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.469 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.469 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.470 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.ceph_mount_options     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.470 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.ceph_mount_point_base  = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.470 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.470 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.471 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.471 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.471 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.471 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.471 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.471 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.472 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.472 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.472 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.472 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.472 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.472 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.473 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.473 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.473 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.473 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.473 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.474 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.474 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.474 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.474 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.474 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.474 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.475 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.475 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.475 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.475 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.475 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.475 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.476 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.476 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.476 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.476 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.476 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.477 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.477 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.477 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.477 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.477 2 WARNING oslo_config.cfg [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Sep 30 07:03:24 compute-0 nova_compute[189265]: live_migration_uri is deprecated for removal in favor of two other options that
Sep 30 07:03:24 compute-0 nova_compute[189265]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Sep 30 07:03:24 compute-0 nova_compute[189265]: and ``live_migration_inbound_addr`` respectively.
Sep 30 07:03:24 compute-0 nova_compute[189265]: ).  Its value may be silently ignored in the future.
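The warning above says `live_migration_uri` is superseded by `live_migration_scheme` and `live_migration_inbound_addr`. A minimal `nova.conf` sketch of that migration, assuming the TLS scheme implied by the `qemu+tls://%s/system` value and reusing the inbound address logged further down (both values are illustrative for this host, not prescriptive):

```ini
[libvirt]
# Replaces the deprecated single templated option
#   live_migration_uri = qemu+tls://%s/system
# The scheme and the target address are now configured separately;
# Nova assembles the migration URI from these two options.
live_migration_scheme = tls
live_migration_inbound_addr = 192.168.122.100
```

With the deprecated option removed, the DEBUG dump above would show `libvirt.live_migration_scheme = tls` instead of the templated URI, and the oslo_config deprecation warning would no longer be emitted at startup.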
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.478 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.478 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.478 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.478 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.478 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.migration_inbound_addr = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.478 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.478 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.479 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.479 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.479 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.479 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.479 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.479 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.479 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.479 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.479 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.480 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.480 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.480 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.480 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.480 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.480 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.480 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.480 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.481 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.481 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.481 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.481 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.481 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.481 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.481 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.482 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.482 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.482 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.482 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.482 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.482 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.482 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.482 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.tb_cache_size          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.483 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.483 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.483 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.483 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.483 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.483 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.483 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.volume_enforce_multipath = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.483 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.484 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.484 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.484 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.484 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.484 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.484 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.484 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.484 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.485 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.485 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.485 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.485 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.485 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.485 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.486 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.486 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.486 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.486 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.486 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.486 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.486 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.487 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.487 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.487 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.487 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.487 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.487 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.487 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.487 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.488 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.488 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.488 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.488 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.488 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.488 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.488 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.489 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.489 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.489 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.489 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] notifications.include_share_mapping = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.489 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] notifications.notification_format = both log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.489 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] notifications.notify_on_state_change = vm_and_task_state log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.489 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.489 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.490 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.490 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.490 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.490 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.490 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.490 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.490 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.490 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.491 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.491 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.491 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.491 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.491 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.491 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.491 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.491 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.491 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.492 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.492 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.492 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.492 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.492 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.492 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.492 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.492 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.493 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.493 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.493 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.493 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.493 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.493 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.493 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.493 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.494 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.494 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.494 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.494 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.494 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.494 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.494 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.494 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.494 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.495 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.495 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.495 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.495 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.495 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.495 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.495 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.495 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.496 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.496 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.496 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.496 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] quota.unified_limits_resource_list = ['servers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.496 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] quota.unified_limits_resource_strategy = require log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.496 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.497 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.497 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.497 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.497 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.497 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.497 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.497 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.497 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.498 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.498 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.498 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.498 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.498 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.498 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.498 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.498 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.499 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.499 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.499 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.hypervisor_version_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.499 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.499 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.image_props_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.499 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.image_props_weight_setting = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.499 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.499 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.500 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.500 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.500 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.500 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.num_instances_weight_multiplier = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.500 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.500 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.500 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.500 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.500 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.501 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.501 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.501 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.501 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.501 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.501 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.501 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.501 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.502 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.502 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.502 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.502 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.502 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.502 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.502 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.503 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.503 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.503 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.503 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.503 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.503 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.503 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.503 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.504 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.504 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.504 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.504 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.504 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.504 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.504 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.504 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.505 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.505 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] spice.require_secure           = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.505 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.505 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.505 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] spice.spice_direct_proxy_base_url = http://127.0.0.1:13002/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.505 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.505 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.506 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.506 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.506 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.506 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.506 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.506 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.506 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.507 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.507 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.507 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.507 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.507 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.507 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.508 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.508 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.508 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.508 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.508 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.508 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.508 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.508 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.508 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.509 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.509 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.509 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.509 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.509 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.509 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.509 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.509 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.510 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.510 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.510 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.510 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.510 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.510 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.510 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.510 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.511 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.511 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.511 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.511 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.511 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.511 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.511 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vnc.server_proxyclient_address = 192.168.122.100 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.512 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.512 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.512 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.512 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.512 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.512 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 podman[189547]: 2025-09-30 07:03:24.514508013 +0000 UTC m=+0.087022129 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.512 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.512 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.513 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.513 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.513 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.513 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.513 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.513 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.513 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.513 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.514 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.514 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.514 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.514 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.514 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.514 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.514 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.514 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.514 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.515 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.515 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.515 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.515 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.515 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.515 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.515 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.516 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.516 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.516 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.516 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.516 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.516 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.516 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.516 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 3 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.517 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.517 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.hostname = compute-0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.517 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.517 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.517 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.517 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.517 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_splay = 0.0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.517 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.processname = nova-compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.518 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.518 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.518 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.518 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.518 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.518 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.518 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.518 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.519 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.519 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.519 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.rabbit_stream_fanout = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.519 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.519 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.rabbit_transient_quorum_queue = True log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.519 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.519 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.519 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.520 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.520 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.520 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.520 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.520 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_rabbit.use_queue_manager = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.520 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_notifications.driver = ['messagingv2'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.520 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.520 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.521 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.521 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.521 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.521 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.521 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.521 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.521 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.521 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.522 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.522 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.522 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.522 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.522 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.522 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.522 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.endpoint_interface  = internal log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.522 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.523 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.endpoint_region_name = regionOne log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.523 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.endpoint_service_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.523 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.endpoint_service_type = compute log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.523 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.523 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.523 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.523 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.523 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.523 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.524 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.524 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.524 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.524 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.524 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.retriable_status_codes = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.524 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.524 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.524 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.524 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.525 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.525 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.525 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.525 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.525 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.525 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.525 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.525 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.526 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.526 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.526 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.526 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.526 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.526 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.526 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.526 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.527 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vif_plug_linux_bridge_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.527 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.527 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.527 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.527 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.527 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.527 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.527 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vif_plug_ovs_privileged.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.527 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.528 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.528 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.528 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.528 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.528 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.528 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.529 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.529 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.529 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.529 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.529 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_vif_ovs.default_qos_type    = linux-noop log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.529 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.530 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.530 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.530 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.530 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.530 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.530 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] privsep_osbrick.capabilities   = [21, 2] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.531 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.531 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.531 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] privsep_osbrick.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.531 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.531 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.531 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.531 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.531 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.531 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.532 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] nova_sys_admin.log_daemon_traceback = False log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.532 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.532 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.532 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2824
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.532 2 DEBUG oslo_service.backend._eventlet.service [None req-796fb4c5-ae51-40be-9113-575aa643bc45 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.12/site-packages/oslo_config/cfg.py:2828
Sep 30 07:03:24 compute-0 nova_compute[189265]: 2025-09-30 07:03:24.533 2 INFO nova.service [-] Starting compute node (version 32.1.0-0.20250919142712.b99a882.el10)
Sep 30 07:03:24 compute-0 podman[189548]: 2025-09-30 07:03:24.543278648 +0000 UTC m=+0.115597268 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.049 2 DEBUG nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:498
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.060 2 DEBUG nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f4ec4f96840> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:504
Sep 30 07:03:25 compute-0 nova_compute[189265]: libvirt:  error : internal error: could not initialize domain event timer
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.061 2 WARNING nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] URI qemu:///system does not support events: internal error: could not initialize domain event timer: libvirt.libvirtError: internal error: could not initialize domain event timer
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.061 2 DEBUG nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f4ec4f96840> _get_new_connection /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:525
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.062 2 DEBUG nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Starting native event thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:484
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.063 2 DEBUG nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:490
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.063 2 INFO nova.utils [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] The default thread pool MainProcess.default is initialized
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.064 2 DEBUG nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Starting connection event dispatch thread _init_events /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:493
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.064 2 INFO nova.virt.libvirt.driver [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Connection event '1' reason 'None'
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.070 2 INFO nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Libvirt host capabilities <capabilities>
Sep 30 07:03:25 compute-0 nova_compute[189265]: 
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <host>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <uuid>27011b05-154b-49a8-b7af-428b3312a4f6</uuid>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <cpu>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <arch>x86_64</arch>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model>EPYC-Rome-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <vendor>AMD</vendor>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <microcode version='16777317'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <signature family='23' model='49' stepping='0'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <maxphysaddr mode='emulate' bits='40'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='x2apic'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='tsc-deadline'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='osxsave'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='hypervisor'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='tsc_adjust'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='spec-ctrl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='stibp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='arch-capabilities'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='ssbd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='cmp_legacy'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='topoext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='virt-ssbd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='lbrv'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='tsc-scale'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='vmcb-clean'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='pause-filter'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='pfthreshold'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='svme-addr-chk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='rdctl-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='skip-l1dfl-vmentry'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='mds-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature name='pschange-mc-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <pages unit='KiB' size='4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <pages unit='KiB' size='2048'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <pages unit='KiB' size='1048576'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </cpu>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <power_management>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <suspend_mem/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <suspend_disk/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <suspend_hybrid/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </power_management>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <iommu support='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <migration_features>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <live/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <uri_transports>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <uri_transport>tcp</uri_transport>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <uri_transport>rdma</uri_transport>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </uri_transports>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </migration_features>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <topology>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <cells num='1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <cell id='0'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:           <memory unit='KiB'>7864116</memory>
Sep 30 07:03:25 compute-0 nova_compute[189265]:           <pages unit='KiB' size='4'>1966029</pages>
Sep 30 07:03:25 compute-0 nova_compute[189265]:           <pages unit='KiB' size='2048'>0</pages>
Sep 30 07:03:25 compute-0 nova_compute[189265]:           <pages unit='KiB' size='1048576'>0</pages>
Sep 30 07:03:25 compute-0 nova_compute[189265]:           <distances>
Sep 30 07:03:25 compute-0 nova_compute[189265]:             <sibling id='0' value='10'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:           </distances>
Sep 30 07:03:25 compute-0 nova_compute[189265]:           <cpus num='8'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:           </cpus>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         </cell>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </cells>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </topology>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <cache>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </cache>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <secmodel>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model>selinux</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <doi>0</doi>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </secmodel>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <secmodel>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model>dac</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <doi>0</doi>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <baselabel type='kvm'>+107:+107</baselabel>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <baselabel type='qemu'>+107:+107</baselabel>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </secmodel>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </host>
Sep 30 07:03:25 compute-0 nova_compute[189265]: 
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <guest>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <os_type>hvm</os_type>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <arch name='i686'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <wordsize>32</wordsize>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <domain type='qemu'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <domain type='kvm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </arch>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <features>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <pae/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <nonpae/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <acpi default='on' toggle='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <apic default='on' toggle='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <cpuselection/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <deviceboot/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <disksnapshot default='on' toggle='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <externalSnapshot/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </features>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </guest>
Sep 30 07:03:25 compute-0 nova_compute[189265]: 
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <guest>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <os_type>hvm</os_type>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <arch name='x86_64'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <wordsize>64</wordsize>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <domain type='qemu'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <domain type='kvm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </arch>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <features>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <acpi default='on' toggle='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <apic default='on' toggle='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <cpuselection/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <deviceboot/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <disksnapshot default='on' toggle='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <externalSnapshot/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </features>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </guest>
Sep 30 07:03:25 compute-0 nova_compute[189265]: 
Sep 30 07:03:25 compute-0 nova_compute[189265]: </capabilities>
Sep 30 07:03:25 compute-0 nova_compute[189265]: 
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.076 2 DEBUG nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.092 2 DEBUG nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Sep 30 07:03:25 compute-0 nova_compute[189265]: <domainCapabilities>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <domain>kvm</domain>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <machine>pc-q35-rhel9.6.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <arch>i686</arch>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <vcpu max='4096'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <iothreads supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <os supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <enum name='firmware'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <loader supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='type'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>rom</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>pflash</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='readonly'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>yes</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>no</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='secure'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>no</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </loader>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </os>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <cpu>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <mode name='host-passthrough' supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='hostPassthroughMigratable'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>on</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>off</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </mode>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <mode name='maximum' supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='maximumMigratable'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>on</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>off</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </mode>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <mode name='host-model' supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <vendor>AMD</vendor>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='x2apic'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='hypervisor'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='stibp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='ssbd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='overflow-recov'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='succor'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='ibrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='lbrv'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='tsc-scale'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='flushbyasid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='pause-filter'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='pfthreshold'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='rdctl-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='mds-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='gds-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='rfds-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='disable' name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </mode>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <mode name='custom' supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-noTSX'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cooperlake'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cooperlake-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cooperlake-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Denverton'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mpx'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Denverton-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mpx'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Denverton-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Denverton-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Dhyana-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Genoa'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amd-psfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='auto-ibrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='no-nested-data-bp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='null-sel-clr-base'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='stibp-always-on'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amd-psfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='auto-ibrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='no-nested-data-bp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='null-sel-clr-base'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='stibp-always-on'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Milan'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Milan-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Milan-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amd-psfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='no-nested-data-bp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='null-sel-clr-base'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='stibp-always-on'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Rome'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Rome-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Rome-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Rome-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='GraniteRapids'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='prefetchiti'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='GraniteRapids-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='prefetchiti'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='GraniteRapids-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx10'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx10-128'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx10-256'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx10-512'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='prefetchiti'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-noTSX'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v5'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v6'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v7'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='IvyBridge'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='IvyBridge-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='IvyBridge-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='IvyBridge-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='KnightsMill'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-4fmaps'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-4vnniw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512er'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512pf'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='KnightsMill-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-4fmaps'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-4vnniw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512er'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512pf'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Opteron_G4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fma4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xop'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Opteron_G4-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fma4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xop'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Opteron_G5'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fma4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tbm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xop'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Opteron_G5-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fma4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tbm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xop'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SapphireRapids'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SapphireRapids-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SapphireRapids-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SapphireRapids-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SierraForest'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-ne-convert'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cmpccxadd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SierraForest-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-ne-convert'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cmpccxadd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v5'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='core-capability'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mpx'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='split-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='core-capability'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mpx'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='split-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='core-capability'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='split-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='core-capability'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='split-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='athlon'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnow'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnowext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='athlon-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnow'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnowext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='core2duo'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='core2duo-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='coreduo'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='coreduo-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='n270'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='n270-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='phenom'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnow'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnowext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='phenom-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnow'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnowext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </mode>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <memoryBacking supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <enum name='sourceType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>file</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>anonymous</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>memfd</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </memoryBacking>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <disk supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='diskDevice'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>disk</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>cdrom</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>floppy</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>lun</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='bus'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>fdc</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>scsi</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>usb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>sata</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio-transitional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio-non-transitional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <graphics supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='type'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vnc</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>egl-headless</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>dbus</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </graphics>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <video supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='modelType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vga</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>cirrus</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>none</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>bochs</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>ramfb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </video>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <hostdev supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='mode'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>subsystem</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='startupPolicy'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>default</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>mandatory</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>requisite</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>optional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='subsysType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>usb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>pci</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>scsi</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='capsType'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='pciBackend'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </hostdev>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <rng supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio-transitional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio-non-transitional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendModel'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>random</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>egd</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>builtin</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <filesystem supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='driverType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>path</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>handle</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtiofs</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </filesystem>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <tpm supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>tpm-tis</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>tpm-crb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendModel'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>emulator</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>external</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendVersion'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>2.0</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </tpm>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <redirdev supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='bus'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>usb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </redirdev>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <channel supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='type'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>pty</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>unix</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </channel>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <crypto supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='type'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>qemu</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendModel'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>builtin</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </crypto>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <interface supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>default</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>passt</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <panic supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>isa</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>hyperv</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </panic>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <features>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <gic supported='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <vmcoreinfo supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <genid supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <backingStoreInput supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <backup supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <async-teardown supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <ps2 supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <sev supported='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <sgx supported='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <hyperv supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='features'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>relaxed</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vapic</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>spinlocks</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vpindex</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>runtime</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>synic</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>stimer</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>reset</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vendor_id</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>frequencies</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>reenlightenment</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>tlbflush</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>ipi</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>avic</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>emsr_bitmap</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>xmm_input</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </hyperv>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <launchSecurity supported='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </features>
Sep 30 07:03:25 compute-0 nova_compute[189265]: </domainCapabilities>
Sep 30 07:03:25 compute-0 nova_compute[189265]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.107 2 DEBUG nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Sep 30 07:03:25 compute-0 nova_compute[189265]: <domainCapabilities>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <domain>kvm</domain>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <machine>pc-i440fx-rhel7.6.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <arch>i686</arch>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <vcpu max='240'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <iothreads supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <os supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <enum name='firmware'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <loader supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='type'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>rom</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>pflash</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='readonly'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>yes</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>no</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='secure'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>no</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </loader>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </os>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <cpu>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <mode name='host-passthrough' supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='hostPassthroughMigratable'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>on</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>off</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </mode>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <mode name='maximum' supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='maximumMigratable'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>on</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>off</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </mode>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <mode name='host-model' supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <vendor>AMD</vendor>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='x2apic'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='hypervisor'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='stibp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='ssbd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='overflow-recov'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='succor'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='ibrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='lbrv'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='tsc-scale'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='flushbyasid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='pause-filter'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='pfthreshold'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='rdctl-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='mds-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='gds-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='rfds-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='disable' name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </mode>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <mode name='custom' supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-noTSX'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cooperlake'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cooperlake-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cooperlake-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Denverton'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mpx'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Denverton-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mpx'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Denverton-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Denverton-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Dhyana-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Genoa'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amd-psfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='auto-ibrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='no-nested-data-bp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='null-sel-clr-base'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='stibp-always-on'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amd-psfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='auto-ibrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='no-nested-data-bp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='null-sel-clr-base'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='stibp-always-on'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Milan'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Milan-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Milan-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amd-psfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='no-nested-data-bp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='null-sel-clr-base'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='stibp-always-on'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Rome'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Rome-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Rome-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Rome-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='GraniteRapids'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='prefetchiti'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='GraniteRapids-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='prefetchiti'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='GraniteRapids-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx10'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx10-128'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx10-256'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx10-512'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='prefetchiti'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-noTSX'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v5'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v6'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v7'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='IvyBridge'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='IvyBridge-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='IvyBridge-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='IvyBridge-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='KnightsMill'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-4fmaps'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-4vnniw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512er'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512pf'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='KnightsMill-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-4fmaps'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-4vnniw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512er'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512pf'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Opteron_G4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fma4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xop'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Opteron_G4-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fma4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xop'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Opteron_G5'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fma4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tbm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xop'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Opteron_G5-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fma4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tbm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xop'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SapphireRapids'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SapphireRapids-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SapphireRapids-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SapphireRapids-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SierraForest'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-ne-convert'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cmpccxadd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SierraForest-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-ne-convert'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cmpccxadd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v5'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='core-capability'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mpx'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='split-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='core-capability'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mpx'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='split-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='core-capability'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='split-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='core-capability'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='split-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='athlon'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnow'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnowext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='athlon-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnow'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnowext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='core2duo'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='core2duo-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='coreduo'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='coreduo-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='n270'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='n270-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='phenom'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnow'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnowext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='phenom-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnow'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnowext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </mode>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <memoryBacking supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <enum name='sourceType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>file</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>anonymous</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>memfd</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </memoryBacking>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <disk supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='diskDevice'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>disk</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>cdrom</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>floppy</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>lun</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='bus'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>ide</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>fdc</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>scsi</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>usb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>sata</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio-transitional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio-non-transitional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <graphics supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='type'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vnc</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>egl-headless</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>dbus</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </graphics>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <video supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='modelType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vga</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>cirrus</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>none</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>bochs</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>ramfb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </video>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <hostdev supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='mode'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>subsystem</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='startupPolicy'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>default</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>mandatory</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>requisite</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>optional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='subsysType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>usb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>pci</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>scsi</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='capsType'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='pciBackend'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </hostdev>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <rng supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio-transitional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio-non-transitional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendModel'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>random</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>egd</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>builtin</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <filesystem supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='driverType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>path</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>handle</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtiofs</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </filesystem>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <tpm supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>tpm-tis</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>tpm-crb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendModel'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>emulator</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>external</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendVersion'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>2.0</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </tpm>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <redirdev supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='bus'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>usb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </redirdev>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <channel supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='type'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>pty</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>unix</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </channel>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <crypto supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='type'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>qemu</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendModel'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>builtin</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </crypto>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <interface supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>default</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>passt</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <panic supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>isa</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>hyperv</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </panic>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <features>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <gic supported='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <vmcoreinfo supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <genid supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <backingStoreInput supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <backup supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <async-teardown supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <ps2 supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <sev supported='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <sgx supported='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <hyperv supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='features'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>relaxed</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vapic</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>spinlocks</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vpindex</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>runtime</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>synic</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>stimer</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>reset</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vendor_id</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>frequencies</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>reenlightenment</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>tlbflush</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>ipi</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>avic</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>emsr_bitmap</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>xmm_input</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </hyperv>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <launchSecurity supported='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </features>
Sep 30 07:03:25 compute-0 nova_compute[189265]: </domainCapabilities>
Sep 30 07:03:25 compute-0 nova_compute[189265]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.124 2 DEBUG nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:944
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.128 2 DEBUG nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Sep 30 07:03:25 compute-0 nova_compute[189265]: <domainCapabilities>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <domain>kvm</domain>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <machine>pc-i440fx-rhel7.6.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <arch>x86_64</arch>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <vcpu max='240'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <iothreads supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <os supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <enum name='firmware'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <loader supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='type'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>rom</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>pflash</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='readonly'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>yes</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>no</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='secure'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>no</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </loader>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </os>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <cpu>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <mode name='host-passthrough' supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='hostPassthroughMigratable'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>on</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>off</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </mode>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <mode name='maximum' supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='maximumMigratable'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>on</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>off</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </mode>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <mode name='host-model' supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <vendor>AMD</vendor>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='x2apic'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='hypervisor'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='stibp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='ssbd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='overflow-recov'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='succor'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='ibrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='lbrv'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='tsc-scale'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='flushbyasid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='pause-filter'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='pfthreshold'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='rdctl-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='mds-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='gds-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='rfds-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='disable' name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </mode>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <mode name='custom' supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-noTSX'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cooperlake'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cooperlake-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cooperlake-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Denverton'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mpx'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Denverton-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mpx'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Denverton-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Denverton-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Dhyana-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Genoa'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amd-psfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='auto-ibrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='no-nested-data-bp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='null-sel-clr-base'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='stibp-always-on'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amd-psfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='auto-ibrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='no-nested-data-bp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='null-sel-clr-base'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='stibp-always-on'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Milan'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Milan-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Milan-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amd-psfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='no-nested-data-bp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='null-sel-clr-base'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='stibp-always-on'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Rome'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Rome-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Rome-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Rome-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='GraniteRapids'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='prefetchiti'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='GraniteRapids-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='prefetchiti'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='GraniteRapids-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx10'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx10-128'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx10-256'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx10-512'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='prefetchiti'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-noTSX'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v5'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v6'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v7'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='IvyBridge'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='IvyBridge-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='IvyBridge-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='IvyBridge-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='KnightsMill'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-4fmaps'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-4vnniw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512er'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512pf'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='KnightsMill-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-4fmaps'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-4vnniw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512er'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512pf'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Opteron_G4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fma4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xop'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Opteron_G4-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fma4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xop'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Opteron_G5'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fma4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tbm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xop'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Opteron_G5-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fma4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tbm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xop'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SapphireRapids'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SapphireRapids-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SapphireRapids-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SapphireRapids-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SierraForest'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-ne-convert'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cmpccxadd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SierraForest-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-ne-convert'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cmpccxadd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v5'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='core-capability'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mpx'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='split-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='core-capability'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mpx'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='split-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='core-capability'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='split-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='core-capability'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='split-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='athlon'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnow'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnowext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='athlon-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnow'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnowext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='core2duo'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='core2duo-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='coreduo'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='coreduo-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='n270'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='n270-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='phenom'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnow'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnowext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='phenom-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnow'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnowext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </mode>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <memoryBacking supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <enum name='sourceType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>file</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>anonymous</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>memfd</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </memoryBacking>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <disk supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='diskDevice'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>disk</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>cdrom</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>floppy</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>lun</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='bus'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>ide</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>fdc</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>scsi</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>usb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>sata</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio-transitional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio-non-transitional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <graphics supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='type'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vnc</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>egl-headless</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>dbus</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </graphics>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <video supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='modelType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vga</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>cirrus</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>none</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>bochs</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>ramfb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </video>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <hostdev supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='mode'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>subsystem</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='startupPolicy'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>default</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>mandatory</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>requisite</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>optional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='subsysType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>usb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>pci</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>scsi</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='capsType'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='pciBackend'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </hostdev>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <rng supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio-transitional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio-non-transitional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendModel'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>random</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>egd</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>builtin</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <filesystem supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='driverType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>path</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>handle</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtiofs</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </filesystem>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <tpm supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>tpm-tis</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>tpm-crb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendModel'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>emulator</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>external</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendVersion'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>2.0</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </tpm>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <redirdev supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='bus'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>usb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </redirdev>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <channel supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='type'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>pty</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>unix</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </channel>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <crypto supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='type'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>qemu</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendModel'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>builtin</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </crypto>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <interface supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>default</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>passt</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <panic supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>isa</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>hyperv</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </panic>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <features>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <gic supported='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <vmcoreinfo supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <genid supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <backingStoreInput supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <backup supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <async-teardown supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <ps2 supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <sev supported='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <sgx supported='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <hyperv supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='features'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>relaxed</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vapic</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>spinlocks</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vpindex</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>runtime</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>synic</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>stimer</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>reset</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vendor_id</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>frequencies</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>reenlightenment</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>tlbflush</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>ipi</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>avic</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>emsr_bitmap</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>xmm_input</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </hyperv>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <launchSecurity supported='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </features>
Sep 30 07:03:25 compute-0 nova_compute[189265]: </domainCapabilities>
Sep 30 07:03:25 compute-0 nova_compute[189265]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.179 2 DEBUG nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Sep 30 07:03:25 compute-0 nova_compute[189265]: <domainCapabilities>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <path>/usr/libexec/qemu-kvm</path>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <domain>kvm</domain>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <machine>pc-q35-rhel9.6.0</machine>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <arch>x86_64</arch>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <vcpu max='4096'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <iothreads supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <os supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <enum name='firmware'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>efi</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <loader supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='type'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>rom</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>pflash</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='readonly'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>yes</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>no</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='secure'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>yes</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>no</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </loader>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </os>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <cpu>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <mode name='host-passthrough' supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='hostPassthroughMigratable'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>on</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>off</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </mode>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <mode name='maximum' supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='maximumMigratable'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>on</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>off</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </mode>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <mode name='host-model' supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model fallback='forbid'>EPYC-Rome</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <vendor>AMD</vendor>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <maxphysaddr mode='passthrough' limit='40'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='x2apic'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='tsc-deadline'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='hypervisor'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='tsc_adjust'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='spec-ctrl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='stibp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='arch-capabilities'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='ssbd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='cmp_legacy'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='overflow-recov'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='succor'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='ibrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='amd-ssbd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='virt-ssbd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='lbrv'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='tsc-scale'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='vmcb-clean'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='flushbyasid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='pause-filter'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='pfthreshold'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='svme-addr-chk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='lfence-always-serializing'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='rdctl-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='skip-l1dfl-vmentry'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='mds-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='pschange-mc-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='gds-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='require' name='rfds-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <feature policy='disable' name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </mode>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <mode name='custom' supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-noTSX'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-noTSX-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Broadwell-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-noTSX'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cascadelake-Server-v5'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cooperlake'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cooperlake-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Cooperlake-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Denverton'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mpx'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Denverton-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 systemd[1]: Starting libvirt nodedev daemon...
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mpx'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Denverton-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Denverton-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Dhyana-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Genoa'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amd-psfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='auto-ibrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='no-nested-data-bp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='null-sel-clr-base'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='stibp-always-on'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Genoa-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amd-psfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='auto-ibrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='no-nested-data-bp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='null-sel-clr-base'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='stibp-always-on'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Milan'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Milan-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Milan-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amd-psfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='no-nested-data-bp'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='null-sel-clr-base'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='stibp-always-on'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Rome'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Rome-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Rome-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-Rome-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='EPYC-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='GraniteRapids'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='prefetchiti'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='GraniteRapids-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='prefetchiti'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='GraniteRapids-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx10'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx10-128'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx10-256'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx10-512'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='prefetchiti'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-noTSX'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-noTSX-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Haswell-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-noTSX'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v5'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v6'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Icelake-Server-v7'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='IvyBridge'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='IvyBridge-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='IvyBridge-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='IvyBridge-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='KnightsMill'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-4fmaps'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-4vnniw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512er'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512pf'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='KnightsMill-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-4fmaps'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-4vnniw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512er'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512pf'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Opteron_G4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fma4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xop'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Opteron_G4-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fma4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xop'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Opteron_G5'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fma4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tbm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xop'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Opteron_G5-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fma4'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tbm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xop'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SapphireRapids'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SapphireRapids-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SapphireRapids-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 systemd[1]: Started libvirt nodedev daemon.
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SapphireRapids-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='amx-tile'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-bf16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-fp16'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512-vpopcntdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bitalg'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vbmi2'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrc'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fzrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='la57'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='taa-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='tsx-ldtrk'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xfd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SierraForest'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-ne-convert'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cmpccxadd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='SierraForest-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-ifma'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-ne-convert'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx-vnni-int8'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='bus-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cmpccxadd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fbsdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='fsrs'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ibrs-all'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mcdt-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pbrsb-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='psdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='sbdr-ssdp-no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='serialize'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vaes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='vpclmulqdq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Client-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='hle'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='rtm'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Skylake-Server-v5'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512bw'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512cd'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512dq'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512f'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='avx512vl'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='invpcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pcid'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='pku'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='core-capability'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mpx'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='split-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='core-capability'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='mpx'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='split-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge-v2'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='core-capability'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='split-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge-v3'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='core-capability'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='split-lock-detect'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='Snowridge-v4'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='cldemote'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='erms'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='gfni'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdir64b'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='movdiri'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='xsaves'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='athlon'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnow'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnowext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='athlon-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnow'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnowext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='core2duo'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='core2duo-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='coreduo'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='coreduo-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='n270'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='n270-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='ss'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='phenom'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnow'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnowext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <blockers model='phenom-v1'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnow'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <feature name='3dnowext'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </blockers>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </mode>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <memoryBacking supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <enum name='sourceType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>file</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>anonymous</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <value>memfd</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </memoryBacking>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <disk supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='diskDevice'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>disk</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>cdrom</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>floppy</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>lun</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='bus'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>fdc</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>scsi</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>usb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>sata</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio-transitional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio-non-transitional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <graphics supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='type'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vnc</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>egl-headless</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>dbus</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </graphics>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <video supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='modelType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vga</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>cirrus</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>none</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>bochs</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>ramfb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </video>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <hostdev supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='mode'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>subsystem</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='startupPolicy'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>default</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>mandatory</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>requisite</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>optional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='subsysType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>usb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>pci</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>scsi</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='capsType'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='pciBackend'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </hostdev>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <rng supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio-transitional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtio-non-transitional</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendModel'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>random</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>egd</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>builtin</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <filesystem supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='driverType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>path</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>handle</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>virtiofs</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </filesystem>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <tpm supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>tpm-tis</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>tpm-crb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendModel'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>emulator</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>external</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendVersion'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>2.0</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </tpm>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <redirdev supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='bus'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>usb</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </redirdev>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <channel supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='type'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>pty</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>unix</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </channel>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <crypto supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='type'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>qemu</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendModel'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>builtin</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </crypto>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <interface supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='backendType'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>default</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>passt</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <panic supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='model'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>isa</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>hyperv</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </panic>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   <features>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <gic supported='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <vmcoreinfo supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <genid supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <backingStoreInput supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <backup supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <async-teardown supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <ps2 supported='yes'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <sev supported='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <sgx supported='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <hyperv supported='yes'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       <enum name='features'>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>relaxed</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vapic</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>spinlocks</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vpindex</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>runtime</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>synic</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>stimer</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>reset</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>vendor_id</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>frequencies</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>reenlightenment</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>tlbflush</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>ipi</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>avic</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>emsr_bitmap</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:         <value>xmm_input</value>
Sep 30 07:03:25 compute-0 nova_compute[189265]:       </enum>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     </hyperv>
Sep 30 07:03:25 compute-0 nova_compute[189265]:     <launchSecurity supported='no'/>
Sep 30 07:03:25 compute-0 nova_compute[189265]:   </features>
Sep 30 07:03:25 compute-0 nova_compute[189265]: </domainCapabilities>
Sep 30 07:03:25 compute-0 nova_compute[189265]:  _get_domain_capabilities /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1029
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.229 2 DEBUG nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1877
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.229 2 INFO nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Secure Boot support detected
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.234 2 INFO nova.virt.libvirt.driver [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.376 2 DEBUG nova.virt.libvirt.driver [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1177
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.570 2 WARNING nova.virt.libvirt.driver [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Cannot update service status on host "compute-0.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.571 2 DEBUG nova.virt.libvirt.volume.mount [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.12/site-packages/nova/virt/libvirt/volume/mount.py:130
Sep 30 07:03:25 compute-0 nova_compute[189265]: 2025-09-30 07:03:25.945 2 INFO nova.virt.node [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Determined node identity 15ca5e4e-ba83-43d2-ad70-d195a46df5cc from /var/lib/nova/compute_id
Sep 30 07:03:26 compute-0 nova_compute[189265]: 2025-09-30 07:03:26.455 2 WARNING nova.compute.manager [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Compute nodes ['15ca5e4e-ba83-43d2-ad70-d195a46df5cc'] for host compute-0.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Sep 30 07:03:27 compute-0 nova_compute[189265]: 2025-09-30 07:03:27.471 2 INFO nova.compute.manager [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Sep 30 07:03:27 compute-0 podman[189636]: 2025-09-30 07:03:27.53740372 +0000 UTC m=+0.111484664 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:03:28 compute-0 sshd-session[189655]: Accepted publickey for zuul from 192.168.122.30 port 33836 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 07:03:28 compute-0 systemd-logind[824]: New session 28 of user zuul.
Sep 30 07:03:28 compute-0 systemd[1]: Started Session 28 of User zuul.
Sep 30 07:03:28 compute-0 sshd-session[189655]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 07:03:28 compute-0 nova_compute[189265]: 2025-09-30 07:03:28.501 2 WARNING nova.compute.manager [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] No compute node record found for host compute-0.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-0.ctlplane.example.com could not be found.
Sep 30 07:03:28 compute-0 nova_compute[189265]: 2025-09-30 07:03:28.502 2 DEBUG oslo_concurrency.lockutils [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:03:28 compute-0 nova_compute[189265]: 2025-09-30 07:03:28.503 2 DEBUG oslo_concurrency.lockutils [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:03:28 compute-0 nova_compute[189265]: 2025-09-30 07:03:28.503 2 DEBUG oslo_concurrency.lockutils [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:03:28 compute-0 nova_compute[189265]: 2025-09-30 07:03:28.503 2 DEBUG nova.compute.resource_tracker [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:03:28 compute-0 nova_compute[189265]: 2025-09-30 07:03:28.716 2 WARNING nova.virt.libvirt.driver [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:03:28 compute-0 nova_compute[189265]: 2025-09-30 07:03:28.718 2 DEBUG oslo_concurrency.processutils [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:03:28 compute-0 nova_compute[189265]: 2025-09-30 07:03:28.760 2 DEBUG oslo_concurrency.processutils [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:03:28 compute-0 nova_compute[189265]: 2025-09-30 07:03:28.761 2 DEBUG nova.compute.resource_tracker [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6197MB free_disk=73.51041030883789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:03:28 compute-0 nova_compute[189265]: 2025-09-30 07:03:28.761 2 DEBUG oslo_concurrency.lockutils [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:03:28 compute-0 nova_compute[189265]: 2025-09-30 07:03:28.761 2 DEBUG oslo_concurrency.lockutils [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:03:29 compute-0 nova_compute[189265]: 2025-09-30 07:03:29.330 2 WARNING nova.compute.resource_tracker [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] No compute node record for compute-0.ctlplane.example.com:15ca5e4e-ba83-43d2-ad70-d195a46df5cc: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 15ca5e4e-ba83-43d2-ad70-d195a46df5cc could not be found.
Sep 30 07:03:29 compute-0 python3.9[189809]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 30 07:03:29 compute-0 nova_compute[189265]: 2025-09-30 07:03:29.841 2 INFO nova.compute.resource_tracker [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Compute node record created for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com with uuid: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc
Sep 30 07:03:30 compute-0 sudo[189963]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rkcbyzrucismpkptpmzvwnmhffsrocnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215810.143474-52-276521154435880/AnsiballZ_systemd_service.py'
Sep 30 07:03:30 compute-0 sudo[189963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:31 compute-0 python3.9[189965]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 07:03:31 compute-0 systemd[1]: Reloading.
Sep 30 07:03:31 compute-0 systemd-rc-local-generator[189994]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:03:31 compute-0 systemd-sysv-generator[189997]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:03:31 compute-0 nova_compute[189265]: 2025-09-30 07:03:31.372 2 DEBUG nova.compute.resource_tracker [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:03:31 compute-0 nova_compute[189265]: 2025-09-30 07:03:31.374 2 DEBUG nova.compute.resource_tracker [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:03:28 up  1:01,  0 user,  load average: 0.85, 0.87, 0.67\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:03:31 compute-0 sudo[189963]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:31 compute-0 nova_compute[189265]: 2025-09-30 07:03:31.545 2 INFO nova.scheduler.client.report [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] [req-74e938aa-628c-4c81-9562-8324161dc579] Created resource provider record via placement API for resource provider with UUID 15ca5e4e-ba83-43d2-ad70-d195a46df5cc and name compute-0.ctlplane.example.com.
Sep 30 07:03:31 compute-0 nova_compute[189265]: 2025-09-30 07:03:31.579 2 DEBUG nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Sep 30 07:03:31 compute-0 nova_compute[189265]: ] _kernel_supports_amd_sev /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1953
Sep 30 07:03:31 compute-0 nova_compute[189265]: 2025-09-30 07:03:31.579 2 INFO nova.virt.libvirt.host [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] kernel doesn't support AMD SEV
Sep 30 07:03:31 compute-0 nova_compute[189265]: 2025-09-30 07:03:31.580 2 DEBUG nova.compute.provider_tree [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Updating inventory in ProviderTree for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 07:03:31 compute-0 nova_compute[189265]: 2025-09-30 07:03:31.580 2 DEBUG nova.virt.libvirt.driver [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 07:03:32 compute-0 nova_compute[189265]: 2025-09-30 07:03:32.187 2 DEBUG nova.scheduler.client.report [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Updated inventory for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Sep 30 07:03:32 compute-0 nova_compute[189265]: 2025-09-30 07:03:32.187 2 DEBUG nova.compute.provider_tree [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Updating resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 07:03:32 compute-0 nova_compute[189265]: 2025-09-30 07:03:32.187 2 DEBUG nova.compute.provider_tree [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Updating inventory in ProviderTree for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 07:03:32 compute-0 nova_compute[189265]: 2025-09-30 07:03:32.376 2 DEBUG nova.compute.provider_tree [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Updating resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 07:03:32 compute-0 python3.9[190150]: ansible-ansible.builtin.service_facts Invoked
Sep 30 07:03:32 compute-0 network[190167]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Sep 30 07:03:32 compute-0 network[190168]: 'network-scripts' will be removed from distribution in near future.
Sep 30 07:03:32 compute-0 network[190169]: It is advised to switch to 'NetworkManager' instead for network management.
Sep 30 07:03:32 compute-0 nova_compute[189265]: 2025-09-30 07:03:32.887 2 DEBUG nova.compute.resource_tracker [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:03:32 compute-0 nova_compute[189265]: 2025-09-30 07:03:32.887 2 DEBUG oslo_concurrency.lockutils [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.126s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:03:32 compute-0 nova_compute[189265]: 2025-09-30 07:03:32.887 2 DEBUG nova.service [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.12/site-packages/nova/service.py:177
Sep 30 07:03:33 compute-0 nova_compute[189265]: 2025-09-30 07:03:33.126 2 DEBUG nova.service [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.12/site-packages/nova/service.py:194
Sep 30 07:03:33 compute-0 nova_compute[189265]: 2025-09-30 07:03:33.127 2 DEBUG nova.servicegroup.drivers.db [None req-74cb1681-29e8-4372-8c49-eceb7eee8f29 - - - - - -] DB_Driver: join new ServiceGroup member compute-0.ctlplane.example.com to the compute group, service = <Service: host=compute-0.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.12/site-packages/nova/servicegroup/drivers/db.py:44
Sep 30 07:03:39 compute-0 sudo[190444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izdbqnjbffmawjxagldmqtexolxqnjfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215819.0573149-90-76292689231903/AnsiballZ_systemd_service.py'
Sep 30 07:03:39 compute-0 sudo[190444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:39 compute-0 python3.9[190446]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:03:39 compute-0 sudo[190444]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:40 compute-0 sudo[190597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fejpqshblquzjvgxhyvwzvpkurnczgid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215820.2611933-110-82530674412591/AnsiballZ_file.py'
Sep 30 07:03:40 compute-0 sudo[190597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:41 compute-0 python3.9[190599]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:03:41 compute-0 sudo[190597]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:41 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 07:03:41 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 07:03:41 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 07:03:41 compute-0 sudo[190750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dundzlbvvlvcdjrgvkfecpiecnktyegx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215821.430032-126-189174697387862/AnsiballZ_file.py'
Sep 30 07:03:41 compute-0 sudo[190750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:42 compute-0 python3.9[190752]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:03:42 compute-0 sudo[190750]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:43 compute-0 sudo[190902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sypzsublbhcunsvtjumvtjjfllfeikyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215822.4624171-144-34513067258082/AnsiballZ_command.py'
Sep 30 07:03:43 compute-0 sudo[190902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:43 compute-0 python3.9[190904]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:03:43 compute-0 sudo[190902]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:44 compute-0 python3.9[191056]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 07:03:45 compute-0 sudo[191206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nucdekwssetdqeykxngyicizwvqokpnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215824.7438188-180-188988118635205/AnsiballZ_systemd_service.py'
Sep 30 07:03:45 compute-0 sudo[191206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:45 compute-0 python3.9[191208]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 07:03:45 compute-0 systemd[1]: Reloading.
Sep 30 07:03:45 compute-0 systemd-rc-local-generator[191230]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:03:45 compute-0 systemd-sysv-generator[191235]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:03:45 compute-0 sudo[191206]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:46 compute-0 sudo[191394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccqxnwwzvgtqiswhgffwrnoebppeejrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215826.0380652-196-129726637451091/AnsiballZ_command.py'
Sep 30 07:03:46 compute-0 sudo[191394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:46 compute-0 python3.9[191396]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:03:46 compute-0 sudo[191394]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:47 compute-0 sudo[191547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssgyhagdinbwpgahnfqckmjmapitkxfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215826.9759095-214-137249839685022/AnsiballZ_file.py'
Sep 30 07:03:47 compute-0 sudo[191547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:47 compute-0 python3.9[191549]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:03:47 compute-0 sudo[191547]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:48 compute-0 python3.9[191699]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:03:49 compute-0 nova_compute[189265]: 2025-09-30 07:03:49.128 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:03:49 compute-0 podman[191825]: 2025-09-30 07:03:49.474547514 +0000 UTC m=+0.077196236 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930)
Sep 30 07:03:49 compute-0 python3.9[191866]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:03:49 compute-0 nova_compute[189265]: 2025-09-30 07:03:49.666 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:03:50 compute-0 python3.9[191992]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215829.0487645-246-17453322546963/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=e86e0e43000ce9ccfe5aefbf8e8f2e3d15d05584 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:03:51 compute-0 sudo[192142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vubwacosktirqkjaamlaukbkmccodagj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215830.7796657-276-88192151258063/AnsiballZ_group.py'
Sep 30 07:03:51 compute-0 sudo[192142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:51 compute-0 python3.9[192144]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Sep 30 07:03:51 compute-0 sudo[192142]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:52 compute-0 sudo[192294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmtvfkbzkbdqfkhvvyecgzxwlbzyzxqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215832.0984502-298-118770719260788/AnsiballZ_getent.py'
Sep 30 07:03:52 compute-0 sudo[192294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:52 compute-0 python3.9[192296]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Sep 30 07:03:53 compute-0 sudo[192294]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:53 compute-0 sudo[192447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gavcmnqwydzijpbtmqvmmcwjsysecims ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215833.2715645-314-26227366403485/AnsiballZ_group.py'
Sep 30 07:03:53 compute-0 sudo[192447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:53 compute-0 python3.9[192449]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Sep 30 07:03:53 compute-0 groupadd[192450]: group added to /etc/group: name=ceilometer, GID=42405
Sep 30 07:03:53 compute-0 groupadd[192450]: group added to /etc/gshadow: name=ceilometer
Sep 30 07:03:53 compute-0 groupadd[192450]: new group: name=ceilometer, GID=42405
Sep 30 07:03:53 compute-0 sudo[192447]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:54 compute-0 sudo[192637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wurfovmqtmgitbuoqbxsshyufjdiqvoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215834.1475463-330-33085760458310/AnsiballZ_user.py'
Sep 30 07:03:54 compute-0 sudo[192637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:03:54 compute-0 podman[192579]: 2025-09-30 07:03:54.829362199 +0000 UTC m=+0.101991064 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:03:54 compute-0 podman[192580]: 2025-09-30 07:03:54.902025149 +0000 UTC m=+0.176219750 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:03:55 compute-0 python3.9[192652]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-0 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Sep 30 07:03:55 compute-0 useradd[192658]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Sep 30 07:03:55 compute-0 useradd[192658]: add 'ceilometer' to group 'libvirt'
Sep 30 07:03:55 compute-0 useradd[192658]: add 'ceilometer' to shadow group 'libvirt'
Sep 30 07:03:55 compute-0 sudo[192637]: pam_unix(sudo:session): session closed for user root
Sep 30 07:03:56 compute-0 python3.9[192814]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:03:57 compute-0 python3.9[192935]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759215836.0184271-382-151136294076610/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:03:57 compute-0 podman[193059]: 2025-09-30 07:03:57.854496853 +0000 UTC m=+0.074510787 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:03:58 compute-0 python3.9[193102]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:03:58 compute-0 python3.9[193225]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759215837.4496403-382-104456765925517/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:03:59 compute-0 python3.9[193375]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:00 compute-0 python3.9[193496]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759215838.9330199-382-6557554452691/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:00 compute-0 python3.9[193646]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:04:01 compute-0 python3.9[193798]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:04:02 compute-0 python3.9[193950]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:02 compute-0 auditd[703]: Audit daemon rotating log files
Sep 30 07:04:03 compute-0 python3.9[194071]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215842.0145662-500-183910178514098/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:03 compute-0 python3.9[194221]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:04 compute-0 python3.9[194297]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:05 compute-0 python3.9[194447]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:05 compute-0 python3.9[194568]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215844.6522012-500-98921334397363/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=ffa17f3239e0dd55ed7347cc28623909421f3090 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:06 compute-0 python3.9[194718]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:07 compute-0 python3.9[194839]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215846.111468-500-37378779184371/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:08 compute-0 python3.9[194989]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:08 compute-0 python3.9[195110]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215847.513351-500-175505397271194/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:09 compute-0 python3.9[195260]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:10 compute-0 python3.9[195381]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215849.0739539-500-8747511342765/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:11 compute-0 python3.9[195531]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:11 compute-0 python3.9[195652]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215850.539985-500-255817228942763/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:12 compute-0 python3.9[195802]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:13 compute-0 python3.9[195923]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215852.044854-500-256193572082628/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:13 compute-0 python3.9[196073]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:14 compute-0 python3.9[196194]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215853.4248137-500-10018251847146/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:15 compute-0 python3.9[196344]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:16 compute-0 python3.9[196465]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215854.8316212-500-130760301241859/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:16 compute-0 python3.9[196615]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:17 compute-0 python3.9[196736]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215856.3435695-500-42370003277990/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:18 compute-0 python3.9[196886]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:19 compute-0 python3.9[196962]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:19 compute-0 podman[197086]: 2025-09-30 07:04:19.766345263 +0000 UTC m=+0.077877586 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 07:04:19 compute-0 python3.9[197129]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:04:20.517 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:04:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:04:20.519 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:04:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:04:20.519 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:04:20 compute-0 python3.9[197208]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:21 compute-0 python3.9[197359]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:21 compute-0 python3.9[197435]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:22 compute-0 sudo[197585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyrqlaegngeqtvbgytfbuwjexrcesfsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215862.2262974-878-201930271314839/AnsiballZ_file.py'
Sep 30 07:04:22 compute-0 sudo[197585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:22 compute-0 nova_compute[189265]: 2025-09-30 07:04:22.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:04:22 compute-0 nova_compute[189265]: 2025-09-30 07:04:22.790 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:04:22 compute-0 nova_compute[189265]: 2025-09-30 07:04:22.790 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:04:22 compute-0 nova_compute[189265]: 2025-09-30 07:04:22.790 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:04:22 compute-0 nova_compute[189265]: 2025-09-30 07:04:22.790 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:04:22 compute-0 nova_compute[189265]: 2025-09-30 07:04:22.791 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:04:22 compute-0 nova_compute[189265]: 2025-09-30 07:04:22.791 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:04:22 compute-0 nova_compute[189265]: 2025-09-30 07:04:22.791 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:04:22 compute-0 nova_compute[189265]: 2025-09-30 07:04:22.791 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:04:22 compute-0 python3.9[197587]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:22 compute-0 sudo[197585]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:23 compute-0 nova_compute[189265]: 2025-09-30 07:04:23.314 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:04:23 compute-0 nova_compute[189265]: 2025-09-30 07:04:23.315 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:04:23 compute-0 nova_compute[189265]: 2025-09-30 07:04:23.315 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:04:23 compute-0 nova_compute[189265]: 2025-09-30 07:04:23.315 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:04:23 compute-0 nova_compute[189265]: 2025-09-30 07:04:23.479 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:04:23 compute-0 nova_compute[189265]: 2025-09-30 07:04:23.480 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:04:23 compute-0 nova_compute[189265]: 2025-09-30 07:04:23.506 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:04:23 compute-0 nova_compute[189265]: 2025-09-30 07:04:23.507 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6152MB free_disk=73.51448059082031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:04:23 compute-0 nova_compute[189265]: 2025-09-30 07:04:23.507 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:04:23 compute-0 nova_compute[189265]: 2025-09-30 07:04:23.507 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:04:23 compute-0 sudo[197738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bmhiasrnyqpbyxiljohiunfimjaasggh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215863.122604-894-109110460463869/AnsiballZ_file.py'
Sep 30 07:04:23 compute-0 sudo[197738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:23 compute-0 python3.9[197740]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:23 compute-0 sudo[197738]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:24 compute-0 sudo[197890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rokxmiyypboxitplhzkpertjxlbvtxio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215864.043769-910-181334212159154/AnsiballZ_file.py'
Sep 30 07:04:24 compute-0 sudo[197890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:24 compute-0 nova_compute[189265]: 2025-09-30 07:04:24.553 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:04:24 compute-0 nova_compute[189265]: 2025-09-30 07:04:24.554 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:04:23 up  1:02,  0 user,  load average: 0.83, 0.86, 0.68\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:04:24 compute-0 nova_compute[189265]: 2025-09-30 07:04:24.579 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:04:24 compute-0 python3.9[197892]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:04:24 compute-0 sudo[197890]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:25 compute-0 nova_compute[189265]: 2025-09-30 07:04:25.088 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:04:25 compute-0 sudo[198069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmddzwrsiqeekewitevaibwebtrezrdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215864.9378774-926-143814162673203/AnsiballZ_systemd_service.py'
Sep 30 07:04:25 compute-0 sudo[198069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:25 compute-0 podman[198016]: 2025-09-30 07:04:25.380867266 +0000 UTC m=+0.089739829 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 07:04:25 compute-0 podman[198017]: 2025-09-30 07:04:25.458605357 +0000 UTC m=+0.166453950 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:04:25 compute-0 nova_compute[189265]: 2025-09-30 07:04:25.597 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:04:25 compute-0 nova_compute[189265]: 2025-09-30 07:04:25.598 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.091s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:04:25 compute-0 python3.9[198077]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:04:25 compute-0 systemd[1]: Reloading.
Sep 30 07:04:25 compute-0 systemd-sysv-generator[198120]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:04:25 compute-0 systemd-rc-local-generator[198116]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:04:26 compute-0 systemd[1]: Listening on Podman API Socket.
Sep 30 07:04:26 compute-0 sudo[198069]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:26 compute-0 sudo[198276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxvjohcsgzgfbtojgjzmxyflsikjfwlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215866.523231-944-42357187202467/AnsiballZ_stat.py'
Sep 30 07:04:26 compute-0 sudo[198276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:27 compute-0 python3.9[198278]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:27 compute-0 sudo[198276]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:27 compute-0 sudo[198399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmhajkwbralvuygydccfeobvcglwkrap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215866.523231-944-42357187202467/AnsiballZ_copy.py'
Sep 30 07:04:27 compute-0 sudo[198399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:27 compute-0 python3.9[198401]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215866.523231-944-42357187202467/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:04:27 compute-0 sudo[198399]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:28 compute-0 podman[198402]: 2025-09-30 07:04:28.096787655 +0000 UTC m=+0.102937558 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 07:04:28 compute-0 sudo[198569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmfkrdjwyolpamkbmqwjrnznwlgclvch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215868.366606-978-40771009994724/AnsiballZ_container_config_data.py'
Sep 30 07:04:28 compute-0 sudo[198569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:29 compute-0 python3.9[198571]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Sep 30 07:04:29 compute-0 sudo[198569]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:29 compute-0 sudo[198721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrhannafudetroizbrsqfoydxkebauwt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215869.334126-996-168390154995953/AnsiballZ_container_config_hash.py'
Sep 30 07:04:29 compute-0 sudo[198721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:30 compute-0 python3.9[198723]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 07:04:30 compute-0 sudo[198721]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:31 compute-0 sudo[198873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwcxwdyczjxjtrfvmncmacsyjlpprthb ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759215870.5125194-1016-278248982183658/AnsiballZ_edpm_container_manage.py'
Sep 30 07:04:31 compute-0 sudo[198873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:31 compute-0 python3[198875]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 07:04:32 compute-0 podman[198888]: 2025-09-30 07:04:32.733909788 +0000 UTC m=+1.336327855 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Sep 30 07:04:32 compute-0 podman[198986]: 2025-09-30 07:04:32.906613731 +0000 UTC m=+0.058494482 container create 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:04:32 compute-0 podman[198986]: 2025-09-30 07:04:32.872996935 +0000 UTC m=+0.024877696 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Sep 30 07:04:32 compute-0 python3[198875]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Sep 30 07:04:33 compute-0 sudo[198873]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:33 compute-0 sudo[199174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzivgcrjwuuvggievxtrvpnsgknzplnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215873.34594-1032-219047455325884/AnsiballZ_stat.py'
Sep 30 07:04:33 compute-0 sudo[199174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:33 compute-0 python3.9[199176]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:04:33 compute-0 sudo[199174]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:34 compute-0 sudo[199328]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sikjxmwodlnwhbkkobmhhcapeyrnndue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215874.2973394-1050-41362477244083/AnsiballZ_file.py'
Sep 30 07:04:34 compute-0 sudo[199328]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:34 compute-0 python3.9[199330]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:34 compute-0 sudo[199328]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:35 compute-0 sudo[199479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dezuubnfwpqfoncndpunfzpdtymlmnfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215874.9023871-1050-183772006830322/AnsiballZ_copy.py'
Sep 30 07:04:35 compute-0 sudo[199479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:35 compute-0 python3.9[199481]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759215874.9023871-1050-183772006830322/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:35 compute-0 sudo[199479]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:36 compute-0 sudo[199555]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tevgyaxrtcvrlojwzxfsircdxhhfeqqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215874.9023871-1050-183772006830322/AnsiballZ_systemd.py'
Sep 30 07:04:36 compute-0 sudo[199555]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:36 compute-0 python3.9[199557]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 07:04:36 compute-0 systemd[1]: Reloading.
Sep 30 07:04:36 compute-0 systemd-rc-local-generator[199583]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:04:36 compute-0 systemd-sysv-generator[199587]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:04:36 compute-0 sudo[199555]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:37 compute-0 sudo[199665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcrwuneorzzvqrobcelhdurnwlyywsde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215874.9023871-1050-183772006830322/AnsiballZ_systemd.py'
Sep 30 07:04:37 compute-0 sudo[199665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:37 compute-0 python3.9[199667]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:04:37 compute-0 systemd[1]: Reloading.
Sep 30 07:04:37 compute-0 systemd-rc-local-generator[199699]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:04:37 compute-0 systemd-sysv-generator[199702]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:04:38 compute-0 systemd[1]: Starting podman_exporter container...
Sep 30 07:04:38 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:04:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c49d553dc76d337f9c0d4cf3129a84a23a53320ecdfd82ecd8432c19d8f4203/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 07:04:38 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c49d553dc76d337f9c0d4cf3129a84a23a53320ecdfd82ecd8432c19d8f4203/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 07:04:38 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b.
Sep 30 07:04:38 compute-0 podman[199706]: 2025-09-30 07:04:38.209096632 +0000 UTC m=+0.141106186 container init 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:04:38 compute-0 podman_exporter[199722]: ts=2025-09-30T07:04:38.232Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Sep 30 07:04:38 compute-0 podman_exporter[199722]: ts=2025-09-30T07:04:38.232Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Sep 30 07:04:38 compute-0 podman_exporter[199722]: ts=2025-09-30T07:04:38.233Z caller=handler.go:94 level=info msg="enabled collectors"
Sep 30 07:04:38 compute-0 podman_exporter[199722]: ts=2025-09-30T07:04:38.233Z caller=handler.go:105 level=info collector=container
Sep 30 07:04:38 compute-0 podman[199706]: 2025-09-30 07:04:38.236945532 +0000 UTC m=+0.168955036 container start 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:04:38 compute-0 podman[199706]: podman_exporter
Sep 30 07:04:38 compute-0 systemd[1]: Starting Podman API Service...
Sep 30 07:04:38 compute-0 systemd[1]: Started Podman API Service.
Sep 30 07:04:38 compute-0 systemd[1]: Started podman_exporter container.
Sep 30 07:04:38 compute-0 podman[199733]: time="2025-09-30T07:04:38Z" level=info msg="/usr/bin/podman filtering at log level info"
Sep 30 07:04:38 compute-0 podman[199733]: time="2025-09-30T07:04:38Z" level=info msg="Setting parallel job count to 25"
Sep 30 07:04:38 compute-0 podman[199733]: time="2025-09-30T07:04:38Z" level=info msg="Using sqlite as database backend"
Sep 30 07:04:38 compute-0 podman[199733]: time="2025-09-30T07:04:38Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Sep 30 07:04:38 compute-0 podman[199733]: time="2025-09-30T07:04:38Z" level=info msg="Using systemd socket activation to determine API endpoint"
Sep 30 07:04:38 compute-0 podman[199733]: time="2025-09-30T07:04:38Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Sep 30 07:04:38 compute-0 sudo[199665]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:38 compute-0 podman[199733]: @ - - [30/Sep/2025:07:04:38 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Sep 30 07:04:38 compute-0 podman[199731]: 2025-09-30 07:04:38.310500725 +0000 UTC m=+0.063220057 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:04:38 compute-0 podman[199733]: time="2025-09-30T07:04:38Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:04:38 compute-0 systemd[1]: 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b-1207da2e90fec548.service: Main process exited, code=exited, status=1/FAILURE
Sep 30 07:04:38 compute-0 systemd[1]: 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b-1207da2e90fec548.service: Failed with result 'exit-code'.
Sep 30 07:04:38 compute-0 podman[199733]: @ - - [30/Sep/2025:07:04:38 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 16538 "" "Go-http-client/1.1"
Sep 30 07:04:38 compute-0 podman_exporter[199722]: ts=2025-09-30T07:04:38.333Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Sep 30 07:04:38 compute-0 podman_exporter[199722]: ts=2025-09-30T07:04:38.333Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Sep 30 07:04:38 compute-0 podman_exporter[199722]: ts=2025-09-30T07:04:38.334Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Sep 30 07:04:39 compute-0 sudo[199919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkvxwvolyaujawycjvfijbpvxkxbiwvx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215878.7008145-1098-31399761638187/AnsiballZ_systemd.py'
Sep 30 07:04:39 compute-0 sudo[199919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:39 compute-0 python3.9[199921]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 07:04:39 compute-0 systemd[1]: Stopping podman_exporter container...
Sep 30 07:04:39 compute-0 podman[199733]: @ - - [30/Sep/2025:07:04:38 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 0 "" "Go-http-client/1.1"
Sep 30 07:04:39 compute-0 systemd[1]: libpod-9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b.scope: Deactivated successfully.
Sep 30 07:04:39 compute-0 podman[199925]: 2025-09-30 07:04:39.520281734 +0000 UTC m=+0.065407780 container died 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:04:39 compute-0 systemd[1]: 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b-1207da2e90fec548.timer: Deactivated successfully.
Sep 30 07:04:39 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b.
Sep 30 07:04:39 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b-userdata-shm.mount: Deactivated successfully.
Sep 30 07:04:39 compute-0 systemd[1]: var-lib-containers-storage-overlay-9c49d553dc76d337f9c0d4cf3129a84a23a53320ecdfd82ecd8432c19d8f4203-merged.mount: Deactivated successfully.
Sep 30 07:04:39 compute-0 podman[199925]: 2025-09-30 07:04:39.738465773 +0000 UTC m=+0.283591809 container cleanup 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:04:39 compute-0 podman[199925]: podman_exporter
Sep 30 07:04:39 compute-0 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Sep 30 07:04:39 compute-0 podman[199951]: podman_exporter
Sep 30 07:04:39 compute-0 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Sep 30 07:04:39 compute-0 systemd[1]: Stopped podman_exporter container.
Sep 30 07:04:39 compute-0 systemd[1]: Starting podman_exporter container...
Sep 30 07:04:39 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:04:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c49d553dc76d337f9c0d4cf3129a84a23a53320ecdfd82ecd8432c19d8f4203/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 07:04:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9c49d553dc76d337f9c0d4cf3129a84a23a53320ecdfd82ecd8432c19d8f4203/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 07:04:40 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b.
Sep 30 07:04:40 compute-0 podman[199964]: 2025-09-30 07:04:40.013464924 +0000 UTC m=+0.156151497 container init 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:04:40 compute-0 podman_exporter[199980]: ts=2025-09-30T07:04:40.029Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Sep 30 07:04:40 compute-0 podman_exporter[199980]: ts=2025-09-30T07:04:40.029Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Sep 30 07:04:40 compute-0 podman_exporter[199980]: ts=2025-09-30T07:04:40.029Z caller=handler.go:94 level=info msg="enabled collectors"
Sep 30 07:04:40 compute-0 podman_exporter[199980]: ts=2025-09-30T07:04:40.029Z caller=handler.go:105 level=info collector=container
Sep 30 07:04:40 compute-0 podman[199733]: @ - - [30/Sep/2025:07:04:40 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Sep 30 07:04:40 compute-0 podman[199733]: time="2025-09-30T07:04:40Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:04:40 compute-0 podman[199964]: 2025-09-30 07:04:40.044492806 +0000 UTC m=+0.187179329 container start 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:04:40 compute-0 podman[199964]: podman_exporter
Sep 30 07:04:40 compute-0 podman[199733]: @ - - [30/Sep/2025:07:04:40 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 16540 "" "Go-http-client/1.1"
Sep 30 07:04:40 compute-0 podman_exporter[199980]: ts=2025-09-30T07:04:40.055Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Sep 30 07:04:40 compute-0 podman_exporter[199980]: ts=2025-09-30T07:04:40.056Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Sep 30 07:04:40 compute-0 podman_exporter[199980]: ts=2025-09-30T07:04:40.056Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Sep 30 07:04:40 compute-0 systemd[1]: Started podman_exporter container.
Sep 30 07:04:40 compute-0 sudo[199919]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:40 compute-0 podman[199990]: 2025-09-30 07:04:40.139795984 +0000 UTC m=+0.082360617 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:04:40 compute-0 sudo[200165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlrudenojltmrcqulbfadtlkvdqjkdxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215880.3439171-1114-229694399086018/AnsiballZ_stat.py'
Sep 30 07:04:40 compute-0 sudo[200165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:40 compute-0 python3.9[200167]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:04:40 compute-0 sudo[200165]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:41 compute-0 sudo[200288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pzzcvwsiqbfqrhpqhqvprsmfvzibnpgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215880.3439171-1114-229694399086018/AnsiballZ_copy.py'
Sep 30 07:04:41 compute-0 sudo[200288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:41 compute-0 python3.9[200290]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759215880.3439171-1114-229694399086018/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Sep 30 07:04:41 compute-0 sudo[200288]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:42 compute-0 sudo[200440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwtkyxrujcvofiiuxotgarkmcuhgjvfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215882.0470371-1148-221068642796774/AnsiballZ_container_config_data.py'
Sep 30 07:04:42 compute-0 sudo[200440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:42 compute-0 python3.9[200442]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Sep 30 07:04:42 compute-0 sudo[200440]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:43 compute-0 sudo[200592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dovzlivtnspojvfozddxtdwmbvhzsifo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215882.9803019-1166-93409282890791/AnsiballZ_container_config_hash.py'
Sep 30 07:04:43 compute-0 sudo[200592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:43 compute-0 python3.9[200594]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Sep 30 07:04:43 compute-0 sudo[200592]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:44 compute-0 sudo[200744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nfwrkjsexdtexjnctzoddoidmefvgsze ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759215883.893483-1186-89288732386653/AnsiballZ_edpm_container_manage.py'
Sep 30 07:04:44 compute-0 sudo[200744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:44 compute-0 python3[200746]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Sep 30 07:04:46 compute-0 podman[200758]: 2025-09-30 07:04:46.911515378 +0000 UTC m=+2.340550469 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Sep 30 07:04:47 compute-0 podman[200855]: 2025-09-30 07:04:47.079938287 +0000 UTC m=+0.061664052 container create e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-type=git, release=1755695350, maintainer=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 07:04:47 compute-0 podman[200855]: 2025-09-30 07:04:47.043231093 +0000 UTC m=+0.024956898 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Sep 30 07:04:47 compute-0 python3[200746]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Sep 30 07:04:47 compute-0 sudo[200744]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:47 compute-0 sudo[201043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flgmezluxwjwpxkdrdqhvmhwsteiairl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215887.4905102-1202-204475018015305/AnsiballZ_stat.py'
Sep 30 07:04:47 compute-0 sudo[201043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:47 compute-0 python3.9[201045]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:04:48 compute-0 sudo[201043]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:48 compute-0 sudo[201197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upzikpzzwlbvourcuilgkqgtixehgtze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215888.3924158-1220-38197646327198/AnsiballZ_file.py'
Sep 30 07:04:48 compute-0 sudo[201197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:48 compute-0 python3.9[201199]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:48 compute-0 sudo[201197]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:49 compute-0 sudo[201348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbykwgzfoaqiqdltopxdasguvbabjljx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215889.001448-1220-69535278986622/AnsiballZ_copy.py'
Sep 30 07:04:49 compute-0 sudo[201348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:49 compute-0 python3.9[201350]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759215889.001448-1220-69535278986622/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:49 compute-0 sudo[201348]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:49 compute-0 sudo[201433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ebnahvcdcnyznthwygjqzlipzdliduoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215889.001448-1220-69535278986622/AnsiballZ_systemd.py'
Sep 30 07:04:49 compute-0 sudo[201433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:50 compute-0 podman[201398]: 2025-09-30 07:04:50.020285389 +0000 UTC m=+0.085296622 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible)
Sep 30 07:04:50 compute-0 python3.9[201443]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Sep 30 07:04:50 compute-0 systemd[1]: Reloading.
Sep 30 07:04:50 compute-0 systemd-rc-local-generator[201476]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:04:50 compute-0 systemd-sysv-generator[201480]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:04:50 compute-0 sudo[201433]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:51 compute-0 sudo[201556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qngsajwnyymeorncnhyvvjsdrztwiagd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215889.001448-1220-69535278986622/AnsiballZ_systemd.py'
Sep 30 07:04:51 compute-0 sudo[201556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:51 compute-0 python3.9[201558]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Sep 30 07:04:51 compute-0 systemd[1]: Reloading.
Sep 30 07:04:51 compute-0 systemd-rc-local-generator[201587]: /etc/rc.d/rc.local is not marked executable, skipping.
Sep 30 07:04:51 compute-0 systemd-sysv-generator[201591]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Sep 30 07:04:51 compute-0 systemd[1]: Starting openstack_network_exporter container...
Sep 30 07:04:51 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:04:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eefe13b05583d5f7faf847a26dba88e91060422dae7a2d7d4de78f371fcbf217/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Sep 30 07:04:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eefe13b05583d5f7faf847a26dba88e91060422dae7a2d7d4de78f371fcbf217/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 07:04:51 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eefe13b05583d5f7faf847a26dba88e91060422dae7a2d7d4de78f371fcbf217/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 07:04:51 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70.
Sep 30 07:04:51 compute-0 podman[201598]: 2025-09-30 07:04:51.971248204 +0000 UTC m=+0.151970597 container init e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, version=9.6, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, name=ubi9-minimal, config_id=edpm, container_name=openstack_network_exporter)
Sep 30 07:04:51 compute-0 openstack_network_exporter[201614]: INFO    07:04:51 main.go:48: registering *bridge.Collector
Sep 30 07:04:51 compute-0 openstack_network_exporter[201614]: INFO    07:04:51 main.go:48: registering *coverage.Collector
Sep 30 07:04:51 compute-0 openstack_network_exporter[201614]: INFO    07:04:51 main.go:48: registering *datapath.Collector
Sep 30 07:04:51 compute-0 openstack_network_exporter[201614]: INFO    07:04:51 main.go:48: registering *iface.Collector
Sep 30 07:04:51 compute-0 openstack_network_exporter[201614]: INFO    07:04:51 main.go:48: registering *memory.Collector
Sep 30 07:04:51 compute-0 openstack_network_exporter[201614]: INFO    07:04:51 main.go:48: registering *ovnnorthd.Collector
Sep 30 07:04:51 compute-0 openstack_network_exporter[201614]: INFO    07:04:51 main.go:48: registering *ovn.Collector
Sep 30 07:04:51 compute-0 openstack_network_exporter[201614]: INFO    07:04:51 main.go:48: registering *ovsdbserver.Collector
Sep 30 07:04:51 compute-0 openstack_network_exporter[201614]: INFO    07:04:51 main.go:48: registering *pmd_perf.Collector
Sep 30 07:04:51 compute-0 openstack_network_exporter[201614]: INFO    07:04:51 main.go:48: registering *pmd_rxq.Collector
Sep 30 07:04:51 compute-0 openstack_network_exporter[201614]: INFO    07:04:51 main.go:48: registering *vswitch.Collector
Sep 30 07:04:51 compute-0 openstack_network_exporter[201614]: NOTICE  07:04:51 main.go:76: listening on https://:9105/metrics
Sep 30 07:04:52 compute-0 podman[201598]: 2025-09-30 07:04:52.002816261 +0000 UTC m=+0.183538624 container start e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, release=1755695350, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6)
Sep 30 07:04:52 compute-0 podman[201598]: openstack_network_exporter
Sep 30 07:04:52 compute-0 systemd[1]: Started openstack_network_exporter container.
Sep 30 07:04:52 compute-0 sudo[201556]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:52 compute-0 podman[201625]: 2025-09-30 07:04:52.146223292 +0000 UTC m=+0.122000877 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, distribution-scope=public, version=9.6, container_name=openstack_network_exporter, release=1755695350, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 07:04:52 compute-0 sudo[201796]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-obzfgdjbdqbwpblcxrddybezruqugdeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215892.268769-1268-36177271068866/AnsiballZ_systemd.py'
Sep 30 07:04:52 compute-0 sudo[201796]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:52 compute-0 python3.9[201798]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Sep 30 07:04:52 compute-0 systemd[1]: Stopping openstack_network_exporter container...
Sep 30 07:04:53 compute-0 systemd[1]: libpod-e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70.scope: Deactivated successfully.
Sep 30 07:04:53 compute-0 podman[201802]: 2025-09-30 07:04:53.016475396 +0000 UTC m=+0.050375718 container died e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, architecture=x86_64, config_id=edpm, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6)
Sep 30 07:04:53 compute-0 systemd[1]: e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70-4a43f41ea9a498f5.timer: Deactivated successfully.
Sep 30 07:04:53 compute-0 systemd[1]: Stopped /usr/bin/podman healthcheck run e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70.
Sep 30 07:04:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70-userdata-shm.mount: Deactivated successfully.
Sep 30 07:04:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-eefe13b05583d5f7faf847a26dba88e91060422dae7a2d7d4de78f371fcbf217-merged.mount: Deactivated successfully.
Sep 30 07:04:53 compute-0 podman[201802]: 2025-09-30 07:04:53.60273466 +0000 UTC m=+0.636634992 container cleanup e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal)
Sep 30 07:04:53 compute-0 podman[201802]: openstack_network_exporter
Sep 30 07:04:53 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Sep 30 07:04:53 compute-0 podman[201830]: openstack_network_exporter
Sep 30 07:04:53 compute-0 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Sep 30 07:04:53 compute-0 systemd[1]: Stopped openstack_network_exporter container.
Sep 30 07:04:53 compute-0 systemd[1]: Starting openstack_network_exporter container...
Sep 30 07:04:53 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:04:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eefe13b05583d5f7faf847a26dba88e91060422dae7a2d7d4de78f371fcbf217/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Sep 30 07:04:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eefe13b05583d5f7faf847a26dba88e91060422dae7a2d7d4de78f371fcbf217/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Sep 30 07:04:53 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eefe13b05583d5f7faf847a26dba88e91060422dae7a2d7d4de78f371fcbf217/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Sep 30 07:04:53 compute-0 systemd[1]: Started /usr/bin/podman healthcheck run e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70.
Sep 30 07:04:53 compute-0 podman[201843]: 2025-09-30 07:04:53.837957578 +0000 UTC m=+0.116555980 container init e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, release=1755695350, vcs-type=git, io.buildah.version=1.33.7, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 07:04:53 compute-0 openstack_network_exporter[201859]: INFO    07:04:53 main.go:48: registering *bridge.Collector
Sep 30 07:04:53 compute-0 openstack_network_exporter[201859]: INFO    07:04:53 main.go:48: registering *coverage.Collector
Sep 30 07:04:53 compute-0 openstack_network_exporter[201859]: INFO    07:04:53 main.go:48: registering *datapath.Collector
Sep 30 07:04:53 compute-0 openstack_network_exporter[201859]: INFO    07:04:53 main.go:48: registering *iface.Collector
Sep 30 07:04:53 compute-0 openstack_network_exporter[201859]: INFO    07:04:53 main.go:48: registering *memory.Collector
Sep 30 07:04:53 compute-0 openstack_network_exporter[201859]: INFO    07:04:53 main.go:48: registering *ovnnorthd.Collector
Sep 30 07:04:53 compute-0 openstack_network_exporter[201859]: INFO    07:04:53 main.go:48: registering *ovn.Collector
Sep 30 07:04:53 compute-0 openstack_network_exporter[201859]: INFO    07:04:53 main.go:48: registering *ovsdbserver.Collector
Sep 30 07:04:53 compute-0 openstack_network_exporter[201859]: INFO    07:04:53 main.go:48: registering *pmd_perf.Collector
Sep 30 07:04:53 compute-0 openstack_network_exporter[201859]: INFO    07:04:53 main.go:48: registering *pmd_rxq.Collector
Sep 30 07:04:53 compute-0 openstack_network_exporter[201859]: INFO    07:04:53 main.go:48: registering *vswitch.Collector
Sep 30 07:04:53 compute-0 openstack_network_exporter[201859]: NOTICE  07:04:53 main.go:76: listening on https://:9105/metrics
Sep 30 07:04:53 compute-0 podman[201843]: 2025-09-30 07:04:53.860443644 +0000 UTC m=+0.139042036 container start e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 07:04:53 compute-0 podman[201843]: openstack_network_exporter
Sep 30 07:04:53 compute-0 systemd[1]: Started openstack_network_exporter container.
Sep 30 07:04:53 compute-0 sudo[201796]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:53 compute-0 podman[201869]: 2025-09-30 07:04:53.98282462 +0000 UTC m=+0.112902064 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.expose-services=, maintainer=Red Hat, Inc., release=1755695350, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that 
uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 07:04:54 compute-0 sudo[202039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhnaqtphxdcqqdbgolkpxlriabzaccrg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215894.2766073-1284-70794618167738/AnsiballZ_find.py'
Sep 30 07:04:54 compute-0 sudo[202039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:54 compute-0 python3.9[202041]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Sep 30 07:04:54 compute-0 sudo[202039]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:55 compute-0 sudo[202210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yusuecfowapwkgvdktkqmjejidmbdfue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215895.310848-1303-38575262518238/AnsiballZ_podman_container_info.py'
Sep 30 07:04:55 compute-0 sudo[202210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:55 compute-0 podman[202165]: 2025-09-30 07:04:55.848499065 +0000 UTC m=+0.080239406 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd)
Sep 30 07:04:55 compute-0 podman[202166]: 2025-09-30 07:04:55.886293401 +0000 UTC m=+0.108607291 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 07:04:56 compute-0 python3.9[202219]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Sep 30 07:04:56 compute-0 sudo[202210]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:56 compute-0 sudo[202399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-necwxrpyxggrfbiarcueuknwlzmzbijj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215896.3565533-1311-74289637857846/AnsiballZ_podman_container_exec.py'
Sep 30 07:04:56 compute-0 sudo[202399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:57 compute-0 python3.9[202401]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 07:04:57 compute-0 systemd[1]: Started libpod-conmon-cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc.scope.
Sep 30 07:04:57 compute-0 podman[202402]: 2025-09-30 07:04:57.281657002 +0000 UTC m=+0.077765345 container exec cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2)
Sep 30 07:04:57 compute-0 podman[202402]: 2025-09-30 07:04:57.316852293 +0000 UTC m=+0.112960626 container exec_died cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:04:57 compute-0 systemd[1]: libpod-conmon-cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc.scope: Deactivated successfully.
Sep 30 07:04:57 compute-0 sudo[202399]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:57 compute-0 sudo[202580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kocffzrzlxulttqenermvtgmiogpfavg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215897.5749567-1319-45435757537228/AnsiballZ_podman_container_exec.py'
Sep 30 07:04:57 compute-0 sudo[202580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:58 compute-0 python3.9[202582]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 07:04:58 compute-0 systemd[1]: Started libpod-conmon-cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc.scope.
Sep 30 07:04:58 compute-0 podman[202583]: 2025-09-30 07:04:58.171927931 +0000 UTC m=+0.080057981 container exec cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 07:04:58 compute-0 podman[202583]: 2025-09-30 07:04:58.205932448 +0000 UTC m=+0.114062648 container exec_died cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:04:58 compute-0 sudo[202580]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:58 compute-0 systemd[1]: libpod-conmon-cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc.scope: Deactivated successfully.
Sep 30 07:04:58 compute-0 podman[202601]: 2025-09-30 07:04:58.26933761 +0000 UTC m=+0.089554644 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 07:04:58 compute-0 sudo[202781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyylnntttopfhvwauuolhsiehlrqistw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215898.4373908-1327-254989588882888/AnsiballZ_file.py'
Sep 30 07:04:58 compute-0 sudo[202781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:58 compute-0 python3.9[202783]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:04:58 compute-0 sudo[202781]: pam_unix(sudo:session): session closed for user root
Sep 30 07:04:59 compute-0 sudo[202933]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nomzsiuxkztjtvewfsgalsffgmcvxqmq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215899.2058733-1336-140678902813963/AnsiballZ_podman_container_info.py'
Sep 30 07:04:59 compute-0 sudo[202933]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:04:59 compute-0 python3.9[202935]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Sep 30 07:04:59 compute-0 sudo[202933]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:00 compute-0 sudo[203097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-beqkcjbmzbrbachypczmphtpfvsjebxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215900.093365-1344-69093631395051/AnsiballZ_podman_container_exec.py'
Sep 30 07:05:00 compute-0 sudo[203097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:00 compute-0 python3.9[203099]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 07:05:00 compute-0 systemd[1]: Started libpod-conmon-586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917.scope.
Sep 30 07:05:00 compute-0 podman[203100]: 2025-09-30 07:05:00.746619797 +0000 UTC m=+0.092649663 container exec 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:05:00 compute-0 podman[203100]: 2025-09-30 07:05:00.782803476 +0000 UTC m=+0.128833322 container exec_died 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Sep 30 07:05:00 compute-0 systemd[1]: libpod-conmon-586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917.scope: Deactivated successfully.
Sep 30 07:05:00 compute-0 sudo[203097]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:01 compute-0 sudo[203280]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iufrtpmokoxvectbehwfzrbtrqoertwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215901.0104942-1352-99450991161999/AnsiballZ_podman_container_exec.py'
Sep 30 07:05:01 compute-0 sudo[203280]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:01 compute-0 python3.9[203282]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 07:05:01 compute-0 systemd[1]: Started libpod-conmon-586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917.scope.
Sep 30 07:05:01 compute-0 podman[203283]: 2025-09-30 07:05:01.640524201 +0000 UTC m=+0.089688368 container exec 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 07:05:01 compute-0 podman[203283]: 2025-09-30 07:05:01.674988341 +0000 UTC m=+0.124152548 container exec_died 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 07:05:01 compute-0 systemd[1]: libpod-conmon-586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917.scope: Deactivated successfully.
Sep 30 07:05:01 compute-0 sudo[203280]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:02 compute-0 sudo[203464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwkpyjajhnsmaaoyytounfdbqomcpuxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215901.8984942-1360-188717903066362/AnsiballZ_file.py'
Sep 30 07:05:02 compute-0 sudo[203464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:02 compute-0 python3.9[203466]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:02 compute-0 sudo[203464]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:03 compute-0 sudo[203616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jhxegrtdisetykvdcymuhotvybesimth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215902.7416143-1369-174775084476200/AnsiballZ_podman_container_info.py'
Sep 30 07:05:03 compute-0 sudo[203616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:03 compute-0 python3.9[203618]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Sep 30 07:05:03 compute-0 sudo[203616]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:03 compute-0 sudo[203782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-azoyfvyyotcrylcsivkwnsiazyvesqyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215903.5762963-1377-45140306168158/AnsiballZ_podman_container_exec.py'
Sep 30 07:05:03 compute-0 sudo[203782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:04 compute-0 python3.9[203784]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 07:05:04 compute-0 systemd[1]: Started libpod-conmon-d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b.scope.
Sep 30 07:05:04 compute-0 podman[203785]: 2025-09-30 07:05:04.328617285 +0000 UTC m=+0.103585518 container exec d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, config_id=iscsid, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=iscsid, tcib_managed=true)
Sep 30 07:05:04 compute-0 podman[203805]: 2025-09-30 07:05:04.394660382 +0000 UTC m=+0.053236451 container exec_died d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:05:04 compute-0 podman[203785]: 2025-09-30 07:05:04.400796848 +0000 UTC m=+0.175765061 container exec_died d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 07:05:04 compute-0 systemd[1]: libpod-conmon-d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b.scope: Deactivated successfully.
Sep 30 07:05:04 compute-0 sudo[203782]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:05 compute-0 sudo[203968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfsbfvgqkkdzopjyntxrpmnratkiwnaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215904.6649773-1385-233414474603550/AnsiballZ_podman_container_exec.py'
Sep 30 07:05:05 compute-0 sudo[203968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:05 compute-0 python3.9[203970]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 07:05:05 compute-0 systemd[1]: Started libpod-conmon-d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b.scope.
Sep 30 07:05:05 compute-0 podman[203971]: 2025-09-30 07:05:05.394622833 +0000 UTC m=+0.091480269 container exec d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 07:05:05 compute-0 podman[203971]: 2025-09-30 07:05:05.431974817 +0000 UTC m=+0.128832233 container exec_died d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2, container_name=iscsid)
Sep 30 07:05:05 compute-0 systemd[1]: libpod-conmon-d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b.scope: Deactivated successfully.
Sep 30 07:05:05 compute-0 sudo[203968]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:06 compute-0 sudo[204151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqzbhohbtpchadvwzoudliznaxvtjjdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215905.7042365-1393-71339646730848/AnsiballZ_file.py'
Sep 30 07:05:06 compute-0 sudo[204151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:06 compute-0 python3.9[204153]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:06 compute-0 sudo[204151]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:07 compute-0 sudo[204303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iujdymtzwldiqfgyzfpgdqsdhqvskdbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215906.6537793-1402-33420062294288/AnsiballZ_podman_container_info.py'
Sep 30 07:05:07 compute-0 sudo[204303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:07 compute-0 python3.9[204305]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Sep 30 07:05:07 compute-0 sudo[204303]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:07 compute-0 sudo[204469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mafgxrzfycljogwwksyiuzotoetaoonu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215907.4653327-1410-271116909455526/AnsiballZ_podman_container_exec.py'
Sep 30 07:05:07 compute-0 sudo[204469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:08 compute-0 python3.9[204471]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 07:05:08 compute-0 systemd[1]: Started libpod-conmon-4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9.scope.
Sep 30 07:05:08 compute-0 podman[204472]: 2025-09-30 07:05:08.173910677 +0000 UTC m=+0.094998600 container exec 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:05:08 compute-0 podman[204472]: 2025-09-30 07:05:08.209004655 +0000 UTC m=+0.130092618 container exec_died 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 07:05:08 compute-0 systemd[1]: libpod-conmon-4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9.scope: Deactivated successfully.
Sep 30 07:05:08 compute-0 sudo[204469]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:08 compute-0 sudo[204654]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqykfhikdqenhbcomphtabjsspdkxyza ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215908.5036223-1418-150294417078360/AnsiballZ_podman_container_exec.py'
Sep 30 07:05:08 compute-0 sudo[204654]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:09 compute-0 python3.9[204656]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 07:05:09 compute-0 systemd[1]: Started libpod-conmon-4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9.scope.
Sep 30 07:05:09 compute-0 podman[204657]: 2025-09-30 07:05:09.249255134 +0000 UTC m=+0.104903295 container exec 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 07:05:09 compute-0 podman[204657]: 2025-09-30 07:05:09.283460137 +0000 UTC m=+0.139108338 container exec_died 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 07:05:09 compute-0 systemd[1]: libpod-conmon-4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9.scope: Deactivated successfully.
Sep 30 07:05:09 compute-0 sudo[204654]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:10 compute-0 sudo[204836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wyqliuqrufzyoflzyjvyfwzsdcammybv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215909.591122-1426-72245758580625/AnsiballZ_file.py'
Sep 30 07:05:10 compute-0 sudo[204836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:10 compute-0 python3.9[204838]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:10 compute-0 sudo[204836]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:10 compute-0 podman[204863]: 2025-09-30 07:05:10.48916783 +0000 UTC m=+0.065442102 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:05:10 compute-0 sudo[205013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcamszuxefiqonpiqpchmhveiljnjajp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215910.5424087-1435-238271000404178/AnsiballZ_podman_container_info.py'
Sep 30 07:05:10 compute-0 sudo[205013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:11 compute-0 python3.9[205015]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Sep 30 07:05:11 compute-0 sudo[205013]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:11 compute-0 sudo[205177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mefaessakmlhirbxgwmsvtwagfopmdei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215911.435115-1443-13256252166373/AnsiballZ_podman_container_exec.py'
Sep 30 07:05:11 compute-0 sudo[205177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:12 compute-0 python3.9[205179]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 07:05:12 compute-0 systemd[1]: Started libpod-conmon-9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b.scope.
Sep 30 07:05:12 compute-0 podman[205180]: 2025-09-30 07:05:12.151501111 +0000 UTC m=+0.106736318 container exec 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:05:12 compute-0 podman[205180]: 2025-09-30 07:05:12.187787224 +0000 UTC m=+0.143022391 container exec_died 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:05:12 compute-0 sudo[205177]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:12 compute-0 systemd[1]: libpod-conmon-9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b.scope: Deactivated successfully.
Sep 30 07:05:12 compute-0 sudo[205359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztrplvyokfksyigyibjbscboleemzwbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215912.4684167-1451-72814859427056/AnsiballZ_podman_container_exec.py'
Sep 30 07:05:12 compute-0 sudo[205359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:13 compute-0 python3.9[205361]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 07:05:13 compute-0 systemd[1]: Started libpod-conmon-9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b.scope.
Sep 30 07:05:13 compute-0 podman[205362]: 2025-09-30 07:05:13.166658899 +0000 UTC m=+0.088518244 container exec 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:05:13 compute-0 podman[205362]: 2025-09-30 07:05:13.204826536 +0000 UTC m=+0.126685841 container exec_died 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:05:13 compute-0 systemd[1]: libpod-conmon-9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b.scope: Deactivated successfully.
Sep 30 07:05:13 compute-0 sudo[205359]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:13 compute-0 sudo[205543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lskxvglmisrcgzeleilwxwdbhcddjvfs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215913.4902132-1459-36180429846139/AnsiballZ_file.py'
Sep 30 07:05:13 compute-0 sudo[205543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:14 compute-0 python3.9[205545]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:14 compute-0 sudo[205543]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:14 compute-0 sudo[205695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wqhuitogcyxtwlniyeimtodtkbihqaok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215914.4521716-1468-160085875396192/AnsiballZ_podman_container_info.py'
Sep 30 07:05:14 compute-0 sudo[205695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:15 compute-0 python3.9[205697]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Sep 30 07:05:15 compute-0 sudo[205695]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:15 compute-0 sudo[205860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gddvyopglbjysjdmbtksgipnnmokcyko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215915.3614829-1476-109951992028460/AnsiballZ_podman_container_exec.py'
Sep 30 07:05:15 compute-0 sudo[205860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:16 compute-0 python3.9[205862]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 07:05:16 compute-0 systemd[1]: Started libpod-conmon-e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70.scope.
Sep 30 07:05:16 compute-0 podman[205863]: 2025-09-30 07:05:16.079065258 +0000 UTC m=+0.062934190 container exec e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64)
Sep 30 07:05:16 compute-0 podman[205863]: 2025-09-30 07:05:16.114892467 +0000 UTC m=+0.098761379 container exec_died e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, release=1755695350, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Sep 30 07:05:16 compute-0 sudo[205860]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:16 compute-0 systemd[1]: libpod-conmon-e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70.scope: Deactivated successfully.
Sep 30 07:05:16 compute-0 sudo[206045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zodluzvctsbfierzdzacrgagqyhjvcmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215916.3869305-1484-140814927123925/AnsiballZ_podman_container_exec.py'
Sep 30 07:05:16 compute-0 sudo[206045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:17 compute-0 python3.9[206047]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Sep 30 07:05:17 compute-0 systemd[1]: Started libpod-conmon-e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70.scope.
Sep 30 07:05:17 compute-0 podman[206048]: 2025-09-30 07:05:17.149891415 +0000 UTC m=+0.113012668 container exec e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, architecture=x86_64, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, version=9.6, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal)
Sep 30 07:05:17 compute-0 podman[206048]: 2025-09-30 07:05:17.188296068 +0000 UTC m=+0.151417311 container exec_died e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Sep 30 07:05:17 compute-0 systemd[1]: libpod-conmon-e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70.scope: Deactivated successfully.
Sep 30 07:05:17 compute-0 sudo[206045]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:17 compute-0 sudo[206228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-toubxtndaehghgecyoevhfxswfpyvqov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215917.4391143-1492-56769104959464/AnsiballZ_file.py'
Sep 30 07:05:17 compute-0 sudo[206228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:18 compute-0 python3.9[206230]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:18 compute-0 sudo[206228]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:05:20.520 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:05:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:05:20.521 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:05:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:05:20.521 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:05:20 compute-0 podman[206255]: 2025-09-30 07:05:20.544244161 +0000 UTC m=+0.111139424 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:05:24 compute-0 podman[206275]: 2025-09-30 07:05:24.495748126 +0000 UTC m=+0.077871568 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, distribution-scope=public, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 07:05:25 compute-0 nova_compute[189265]: 2025-09-30 07:05:25.592 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:05:25 compute-0 nova_compute[189265]: 2025-09-30 07:05:25.592 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.106 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.107 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.107 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.107 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.107 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.108 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.108 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.108 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:05:26 compute-0 podman[206296]: 2025-09-30 07:05:26.487231724 +0000 UTC m=+0.070609950 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:05:26 compute-0 podman[206297]: 2025-09-30 07:05:26.586188127 +0000 UTC m=+0.156078375 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.623 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.623 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.623 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.624 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.775 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.776 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.795 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.796 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6084MB free_disk=73.34300994873047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.797 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:05:26 compute-0 nova_compute[189265]: 2025-09-30 07:05:26.797 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:05:27 compute-0 nova_compute[189265]: 2025-09-30 07:05:27.850 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:05:27 compute-0 nova_compute[189265]: 2025-09-30 07:05:27.851 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:05:26 up  1:03,  0 user,  load average: 1.16, 0.98, 0.73\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:05:27 compute-0 nova_compute[189265]: 2025-09-30 07:05:27.875 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:05:28 compute-0 nova_compute[189265]: 2025-09-30 07:05:28.384 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:05:28 compute-0 podman[206343]: 2025-09-30 07:05:28.497682168 +0000 UTC m=+0.078876047 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 07:05:28 compute-0 nova_compute[189265]: 2025-09-30 07:05:28.897 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:05:28 compute-0 nova_compute[189265]: 2025-09-30 07:05:28.897 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.100s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:05:38 compute-0 unix_chkpwd[206365]: password check failed for user (root)
Sep 30 07:05:38 compute-0 sshd-session[206363]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Sep 30 07:05:39 compute-0 sudo[206491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpievkeuhesakmncbhepbkzvsscyesne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215939.4762278-1700-150443379862311/AnsiballZ_file.py'
Sep 30 07:05:39 compute-0 sudo[206491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:40 compute-0 python3.9[206493]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:40 compute-0 sudo[206491]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:40 compute-0 sudo[206655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiokppiatdrcmylwobbfzzpbjpnbkpgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215940.2901304-1716-134887733970411/AnsiballZ_stat.py'
Sep 30 07:05:40 compute-0 sudo[206655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:40 compute-0 podman[206617]: 2025-09-30 07:05:40.702592148 +0000 UTC m=+0.060256122 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:05:40 compute-0 sshd-session[206363]: Failed password for root from 193.46.255.103 port 35900 ssh2
Sep 30 07:05:40 compute-0 python3.9[206666]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:05:40 compute-0 sudo[206655]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:41 compute-0 unix_chkpwd[206694]: password check failed for user (root)
Sep 30 07:05:41 compute-0 sudo[206788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jancxhmzezveyiylhtksywotikxyewfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215940.2901304-1716-134887733970411/AnsiballZ_copy.py'
Sep 30 07:05:41 compute-0 sudo[206788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:41 compute-0 python3.9[206790]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759215940.2901304-1716-134887733970411/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:41 compute-0 sudo[206788]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:42 compute-0 sudo[206940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fzabiujhetpedrscinadgztachbjdauh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215941.8634098-1748-111541995165914/AnsiballZ_file.py'
Sep 30 07:05:42 compute-0 sudo[206940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:42 compute-0 python3.9[206942]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:42 compute-0 sudo[206940]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:42 compute-0 sshd-session[206363]: Failed password for root from 193.46.255.103 port 35900 ssh2
Sep 30 07:05:43 compute-0 sudo[207092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvetsjvudsjlwctzvbkgfzpzmlcebvle ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215942.7016885-1764-160433178439819/AnsiballZ_stat.py'
Sep 30 07:05:43 compute-0 sudo[207092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:43 compute-0 unix_chkpwd[207095]: password check failed for user (root)
Sep 30 07:05:43 compute-0 python3.9[207094]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:05:43 compute-0 sudo[207092]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:43 compute-0 sudo[207171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aoxcccslfulrystxinjkduffbaxyrabg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215942.7016885-1764-160433178439819/AnsiballZ_file.py'
Sep 30 07:05:43 compute-0 sudo[207171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:43 compute-0 python3.9[207173]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:43 compute-0 sudo[207171]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:44 compute-0 sudo[207323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrqlpihllivbzagsjtptmsujjpotgepk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215944.0237308-1788-80366684722496/AnsiballZ_stat.py'
Sep 30 07:05:44 compute-0 sudo[207323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:44 compute-0 python3.9[207325]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:05:44 compute-0 sudo[207323]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:44 compute-0 sudo[207401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfpwlzoqnsbrpsbamfghrvrioasyetnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215944.0237308-1788-80366684722496/AnsiballZ_file.py'
Sep 30 07:05:44 compute-0 sudo[207401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:44 compute-0 python3.9[207403]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.edag2258 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:45 compute-0 sudo[207401]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:45 compute-0 sudo[207553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbxqtbwxmskazzurnqwdtyzqtkohusha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215945.2337196-1812-263111178187387/AnsiballZ_stat.py'
Sep 30 07:05:45 compute-0 sudo[207553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:45 compute-0 python3.9[207555]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:05:45 compute-0 sshd-session[206363]: Failed password for root from 193.46.255.103 port 35900 ssh2
Sep 30 07:05:45 compute-0 sudo[207553]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:46 compute-0 sudo[207631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jurctoapcszibgvawqnldewiflwlmmln ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215945.2337196-1812-263111178187387/AnsiballZ_file.py'
Sep 30 07:05:46 compute-0 sudo[207631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:46 compute-0 python3.9[207633]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:46 compute-0 sudo[207631]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:47 compute-0 sudo[207783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuvmenoezmmpuxlzautgfjftmqumddpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215946.5272534-1838-162119213701065/AnsiballZ_command.py'
Sep 30 07:05:47 compute-0 sudo[207783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:47 compute-0 python3.9[207785]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:05:47 compute-0 sshd-session[206363]: Received disconnect from 193.46.255.103 port 35900:11:  [preauth]
Sep 30 07:05:47 compute-0 sshd-session[206363]: Disconnected from authenticating user root 193.46.255.103 port 35900 [preauth]
Sep 30 07:05:47 compute-0 sshd-session[206363]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Sep 30 07:05:47 compute-0 sudo[207783]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:48 compute-0 sudo[207938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njgzebggpqktkxcdnxehlqbgctxaazeg ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1759215947.5651689-1854-82533599495110/AnsiballZ_edpm_nftables_from_files.py'
Sep 30 07:05:48 compute-0 sudo[207938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:48 compute-0 unix_chkpwd[207941]: password check failed for user (root)
Sep 30 07:05:48 compute-0 sshd-session[207811]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Sep 30 07:05:48 compute-0 python3[207940]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Sep 30 07:05:48 compute-0 sudo[207938]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:49 compute-0 sudo[208091]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ftdmmrxbqyyevqqdajgfgsqhflzkondk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215948.5611126-1870-208476966332800/AnsiballZ_stat.py'
Sep 30 07:05:49 compute-0 sudo[208091]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:49 compute-0 python3.9[208093]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:05:49 compute-0 sudo[208091]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:49 compute-0 sudo[208169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hchhlbpmsrdtrkzznzcthqoydjzvekyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215948.5611126-1870-208476966332800/AnsiballZ_file.py'
Sep 30 07:05:49 compute-0 sudo[208169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:49 compute-0 python3.9[208171]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:49 compute-0 sudo[208169]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:50 compute-0 sshd-session[207811]: Failed password for root from 193.46.255.103 port 12042 ssh2
Sep 30 07:05:50 compute-0 unix_chkpwd[208295]: password check failed for user (root)
Sep 30 07:05:50 compute-0 sudo[208322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qbgesmfezsqzdezwkgyaiiccfclzgjnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215949.9386773-1894-26167199144791/AnsiballZ_stat.py'
Sep 30 07:05:50 compute-0 sudo[208322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:50 compute-0 python3.9[208324]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:05:50 compute-0 sudo[208322]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:51 compute-0 sudo[208410]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxqteltixyzjlnecfnshbtyrvkyyxrzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215949.9386773-1894-26167199144791/AnsiballZ_file.py'
Sep 30 07:05:51 compute-0 sudo[208410]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:51 compute-0 podman[208374]: 2025-09-30 07:05:51.039812126 +0000 UTC m=+0.102092444 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=iscsid)
Sep 30 07:05:51 compute-0 python3.9[208418]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:51 compute-0 sudo[208410]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:52 compute-0 sudo[208569]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdwjlclbbreertdwrbdpyxinqzhmjvuz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215951.6498122-1918-106016506035602/AnsiballZ_stat.py'
Sep 30 07:05:52 compute-0 sudo[208569]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:52 compute-0 sshd-session[207811]: Failed password for root from 193.46.255.103 port 12042 ssh2
Sep 30 07:05:52 compute-0 python3.9[208571]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:05:52 compute-0 sudo[208569]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:52 compute-0 unix_chkpwd[208597]: password check failed for user (root)
Sep 30 07:05:52 compute-0 sudo[208648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eatacfhabrpjwcyowlegpnpuptueikbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215951.6498122-1918-106016506035602/AnsiballZ_file.py'
Sep 30 07:05:52 compute-0 sudo[208648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:52 compute-0 python3.9[208650]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:52 compute-0 sudo[208648]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:53 compute-0 sudo[208800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tncolqgncpdmycwdfhlwbdmmeexjxdgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215953.1295536-1942-238424235048823/AnsiballZ_stat.py'
Sep 30 07:05:53 compute-0 sudo[208800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:53 compute-0 python3.9[208802]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:05:53 compute-0 sudo[208800]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:54 compute-0 sudo[208878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtbfajrgnakjyjyhkpvjwlntaidaeovl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215953.1295536-1942-238424235048823/AnsiballZ_file.py'
Sep 30 07:05:54 compute-0 sudo[208878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:54 compute-0 python3.9[208880]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:54 compute-0 sudo[208878]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:54 compute-0 sshd-session[207811]: Failed password for root from 193.46.255.103 port 12042 ssh2
Sep 30 07:05:54 compute-0 sshd-session[207811]: Received disconnect from 193.46.255.103 port 12042:11:  [preauth]
Sep 30 07:05:54 compute-0 sshd-session[207811]: Disconnected from authenticating user root 193.46.255.103 port 12042 [preauth]
Sep 30 07:05:54 compute-0 sshd-session[207811]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Sep 30 07:05:55 compute-0 sudo[209045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yuyokbklljlhdclggtefrxnnwxixmrae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215954.5125954-1966-203810875260118/AnsiballZ_stat.py'
Sep 30 07:05:55 compute-0 sudo[209045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:55 compute-0 podman[209006]: 2025-09-30 07:05:55.023283279 +0000 UTC m=+0.077227530 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., distribution-scope=public, version=9.6, config_id=edpm, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 07:05:55 compute-0 python3.9[209055]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Sep 30 07:05:55 compute-0 sudo[209045]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:55 compute-0 unix_chkpwd[209081]: password check failed for user (root)
Sep 30 07:05:55 compute-0 sshd-session[208957]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Sep 30 07:05:55 compute-0 sudo[209179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sffvejknvsfmiyqjrmfhbjsustsyzvjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215954.5125954-1966-203810875260118/AnsiballZ_copy.py'
Sep 30 07:05:55 compute-0 sudo[209179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:56 compute-0 python3.9[209181]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759215954.5125954-1966-203810875260118/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:56 compute-0 sudo[209179]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:56 compute-0 sudo[209355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyoswccfnpimqqaaqyhnupjxrmgupczr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215956.3582027-1996-200421892623100/AnsiballZ_file.py'
Sep 30 07:05:56 compute-0 sudo[209355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:56 compute-0 podman[209305]: 2025-09-30 07:05:56.745372309 +0000 UTC m=+0.081099152 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:05:56 compute-0 podman[209306]: 2025-09-30 07:05:56.814737102 +0000 UTC m=+0.145714758 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4)
Sep 30 07:05:56 compute-0 python3.9[209368]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:56 compute-0 sudo[209355]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:57 compute-0 sudo[209529]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dygxoeibzsgqpehvojabbjmxsaigeehi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215957.2072132-2012-254374495326419/AnsiballZ_command.py'
Sep 30 07:05:57 compute-0 sudo[209529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:57 compute-0 sshd-session[208957]: Failed password for root from 193.46.255.103 port 47926 ssh2
Sep 30 07:05:57 compute-0 python3.9[209531]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:05:57 compute-0 sudo[209529]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:58 compute-0 sudo[209684]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fddyvavjjpxonfedlsbzifhzqtyytqgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215958.0579617-2028-82403229734952/AnsiballZ_blockinfile.py'
Sep 30 07:05:58 compute-0 sudo[209684]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:58 compute-0 podman[209686]: 2025-09-30 07:05:58.708209644 +0000 UTC m=+0.092962152 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 07:05:58 compute-0 python3.9[209687]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:05:58 compute-0 sudo[209684]: pam_unix(sudo:session): session closed for user root
Sep 30 07:05:59 compute-0 unix_chkpwd[209855]: password check failed for user (root)
Sep 30 07:05:59 compute-0 sudo[209854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjcchpxkzeieipquahdnvglappxegjzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215959.1848648-2046-125691054360157/AnsiballZ_command.py'
Sep 30 07:05:59 compute-0 sudo[209854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:05:59 compute-0 python3.9[209857]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:05:59 compute-0 sudo[209854]: pam_unix(sudo:session): session closed for user root
Sep 30 07:06:00 compute-0 sudo[210008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrqwjufeajiwvxpeideoglmqjffnqsal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215960.1130984-2062-27005631108916/AnsiballZ_stat.py'
Sep 30 07:06:00 compute-0 sudo[210008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:06:00 compute-0 python3.9[210010]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 30 07:06:00 compute-0 sudo[210008]: pam_unix(sudo:session): session closed for user root
Sep 30 07:06:01 compute-0 sudo[210162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlwuoxjprgmlcvgyadslryzurcebeeyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215961.0185223-2078-27710883777836/AnsiballZ_command.py'
Sep 30 07:06:01 compute-0 sudo[210162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:06:01 compute-0 python3.9[210164]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Sep 30 07:06:01 compute-0 sudo[210162]: pam_unix(sudo:session): session closed for user root
Sep 30 07:06:01 compute-0 sshd-session[208957]: Failed password for root from 193.46.255.103 port 47926 ssh2
Sep 30 07:06:02 compute-0 sudo[210317]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vhyrlglwdvcboxworjijojmqrfvipzdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1759215961.9484677-2094-28075243641202/AnsiballZ_file.py'
Sep 30 07:06:02 compute-0 sudo[210317]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:06:02 compute-0 openstack_network_exporter[201859]: ERROR   07:06:02 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:06:02 compute-0 openstack_network_exporter[201859]: ERROR   07:06:02 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:06:02 compute-0 openstack_network_exporter[201859]: ERROR   07:06:02 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:06:02 compute-0 openstack_network_exporter[201859]: ERROR   07:06:02 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:06:02 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:06:02 compute-0 openstack_network_exporter[201859]: ERROR   07:06:02 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:06:02 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:06:02 compute-0 python3.9[210319]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Sep 30 07:06:02 compute-0 sudo[210317]: pam_unix(sudo:session): session closed for user root
Sep 30 07:06:02 compute-0 podman[199733]: time="2025-09-30T07:06:02Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:06:02 compute-0 podman[199733]: @ - - [30/Sep/2025:07:06:02 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:06:02 compute-0 podman[199733]: @ - - [30/Sep/2025:07:06:02 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2978 "" "Go-http-client/1.1"
Sep 30 07:06:02 compute-0 sshd-session[189658]: Connection closed by 192.168.122.30 port 33836
Sep 30 07:06:03 compute-0 sshd-session[189655]: pam_unix(sshd:session): session closed for user zuul
Sep 30 07:06:03 compute-0 systemd[1]: session-28.scope: Deactivated successfully.
Sep 30 07:06:03 compute-0 systemd[1]: session-28.scope: Consumed 1min 40.842s CPU time.
Sep 30 07:06:03 compute-0 systemd-logind[824]: Session 28 logged out. Waiting for processes to exit.
Sep 30 07:06:03 compute-0 systemd-logind[824]: Removed session 28.
Sep 30 07:06:03 compute-0 unix_chkpwd[210352]: password check failed for user (root)
Sep 30 07:06:05 compute-0 sshd-session[208957]: Failed password for root from 193.46.255.103 port 47926 ssh2
Sep 30 07:06:05 compute-0 sshd-session[208957]: Received disconnect from 193.46.255.103 port 47926:11:  [preauth]
Sep 30 07:06:05 compute-0 sshd-session[208957]: Disconnected from authenticating user root 193.46.255.103 port 47926 [preauth]
Sep 30 07:06:05 compute-0 sshd-session[208957]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.103  user=root
Sep 30 07:06:11 compute-0 podman[210354]: 2025-09-30 07:06:11.497946288 +0000 UTC m=+0.073127582 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:06:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:06:20.522 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:06:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:06:20.522 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:06:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:06:20.523 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:06:21 compute-0 podman[210379]: 2025-09-30 07:06:21.507317436 +0000 UTC m=+0.084481588 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 07:06:25 compute-0 podman[210399]: 2025-09-30 07:06:25.482678046 +0000 UTC m=+0.067716447 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_id=edpm, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Sep 30 07:06:27 compute-0 podman[210420]: 2025-09-30 07:06:27.526233201 +0000 UTC m=+0.100708374 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 07:06:27 compute-0 podman[210421]: 2025-09-30 07:06:27.579897973 +0000 UTC m=+0.150246078 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4)
Sep 30 07:06:28 compute-0 nova_compute[189265]: 2025-09-30 07:06:28.898 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:06:28 compute-0 nova_compute[189265]: 2025-09-30 07:06:28.899 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:06:28 compute-0 nova_compute[189265]: 2025-09-30 07:06:28.900 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:06:28 compute-0 nova_compute[189265]: 2025-09-30 07:06:28.900 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:06:28 compute-0 nova_compute[189265]: 2025-09-30 07:06:28.900 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:06:28 compute-0 nova_compute[189265]: 2025-09-30 07:06:28.900 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:06:28 compute-0 nova_compute[189265]: 2025-09-30 07:06:28.901 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:06:28 compute-0 nova_compute[189265]: 2025-09-30 07:06:28.901 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:06:28 compute-0 nova_compute[189265]: 2025-09-30 07:06:28.901 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:06:29 compute-0 nova_compute[189265]: 2025-09-30 07:06:29.416 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:06:29 compute-0 nova_compute[189265]: 2025-09-30 07:06:29.417 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:06:29 compute-0 nova_compute[189265]: 2025-09-30 07:06:29.417 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:06:29 compute-0 nova_compute[189265]: 2025-09-30 07:06:29.417 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:06:29 compute-0 podman[210466]: 2025-09-30 07:06:29.501976188 +0000 UTC m=+0.087358701 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 07:06:29 compute-0 nova_compute[189265]: 2025-09-30 07:06:29.602 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:06:29 compute-0 nova_compute[189265]: 2025-09-30 07:06:29.603 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:06:29 compute-0 nova_compute[189265]: 2025-09-30 07:06:29.618 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:06:29 compute-0 nova_compute[189265]: 2025-09-30 07:06:29.619 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6121MB free_disk=73.34300231933594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:06:29 compute-0 nova_compute[189265]: 2025-09-30 07:06:29.619 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:06:29 compute-0 nova_compute[189265]: 2025-09-30 07:06:29.619 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:06:29 compute-0 podman[199733]: time="2025-09-30T07:06:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:06:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:06:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:06:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:06:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2982 "" "Go-http-client/1.1"
Sep 30 07:06:30 compute-0 nova_compute[189265]: 2025-09-30 07:06:30.689 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:06:30 compute-0 nova_compute[189265]: 2025-09-30 07:06:30.690 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:06:29 up  1:04,  0 user,  load average: 0.57, 0.84, 0.70\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:06:30 compute-0 nova_compute[189265]: 2025-09-30 07:06:30.718 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:06:31 compute-0 nova_compute[189265]: 2025-09-30 07:06:31.247 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:06:31 compute-0 openstack_network_exporter[201859]: ERROR   07:06:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:06:31 compute-0 openstack_network_exporter[201859]: ERROR   07:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:06:31 compute-0 openstack_network_exporter[201859]: ERROR   07:06:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:06:31 compute-0 openstack_network_exporter[201859]: ERROR   07:06:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:06:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:06:31 compute-0 openstack_network_exporter[201859]: ERROR   07:06:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:06:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:06:31 compute-0 nova_compute[189265]: 2025-09-30 07:06:31.755 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:06:31 compute-0 nova_compute[189265]: 2025-09-30 07:06:31.756 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.137s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:06:42 compute-0 podman[210487]: 2025-09-30 07:06:42.484419973 +0000 UTC m=+0.066731639 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:06:47 compute-0 PackageKit[126896]: daemon quit
Sep 30 07:06:47 compute-0 systemd[1]: packagekit.service: Deactivated successfully.
Sep 30 07:06:52 compute-0 podman[210512]: 2025-09-30 07:06:52.472825721 +0000 UTC m=+0.055598409 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=iscsid)
Sep 30 07:06:56 compute-0 podman[210532]: 2025-09-30 07:06:56.461662756 +0000 UTC m=+0.052195261 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., distribution-scope=public, container_name=openstack_network_exporter, release=1755695350, vcs-type=git)
Sep 30 07:06:58 compute-0 podman[210553]: 2025-09-30 07:06:58.502046166 +0000 UTC m=+0.081554706 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 07:06:58 compute-0 podman[210554]: 2025-09-30 07:06:58.539705729 +0000 UTC m=+0.125804588 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 07:07:00 compute-0 podman[210598]: 2025-09-30 07:07:00.500703475 +0000 UTC m=+0.077951022 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930)
Sep 30 07:07:10 compute-0 sshd-session[210617]: Invalid user lab from 185.156.73.233 port 48760
Sep 30 07:07:11 compute-0 sshd-session[210617]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:07:11 compute-0 sshd-session[210617]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233
Sep 30 07:07:12 compute-0 sshd-session[210617]: Failed password for invalid user lab from 185.156.73.233 port 48760 ssh2
Sep 30 07:07:13 compute-0 podman[210619]: 2025-09-30 07:07:13.531253635 +0000 UTC m=+0.105328400 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:07:14 compute-0 sshd-session[210617]: Connection closed by invalid user lab 185.156.73.233 port 48760 [preauth]
Sep 30 07:07:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:15.296 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:07:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:15.297 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:07:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:15.301 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:07:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:20.524 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:07:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:20.524 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:07:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:20.524 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:07:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:20.961 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:99:a6:b8 192.168.122.171'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.122.171/24', 'neutron:device_id': 'ovnmeta-72aaf760-0609-482c-9256-5ef99c67c4f7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-72aaf760-0609-482c-9256-5ef99c67c4f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4049964ce8244dacb50493f6676c6613', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a77c5006-5513-4d8b-b972-e1e36cbe2812, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=e29767f7-a7f5-4c30-ab53-e93d0d25696e) old=Port_Binding(mac=['fa:16:3e:99:a6:b8'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-72aaf760-0609-482c-9256-5ef99c67c4f7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-72aaf760-0609-482c-9256-5ef99c67c4f7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4049964ce8244dacb50493f6676c6613', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:07:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:20.962 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e29767f7-a7f5-4c30-ab53-e93d0d25696e in datapath 72aaf760-0609-482c-9256-5ef99c67c4f7 updated
Sep 30 07:07:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:20.964 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 72aaf760-0609-482c-9256-5ef99c67c4f7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:07:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:20.965 100322 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp0mdaoum8/privsep.sock']
Sep 30 07:07:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:21.689 100322 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Sep 30 07:07:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:21.689 100322 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp0mdaoum8/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Sep 30 07:07:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:21.545 210650 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 07:07:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:21.552 210650 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 07:07:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:21.556 210650 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Sep 30 07:07:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:21.557 210650 INFO oslo.privsep.daemon [-] privsep daemon running as pid 210650
Sep 30 07:07:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:21.691 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b4a01b53-fa47-42f9-95db-f294f3376419]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:07:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:22.162 210650 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:07:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:22.162 210650 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:07:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:22.162 210650 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:07:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:22.596 210650 INFO oslo_service.backend [-] Loading backend: eventlet
Sep 30 07:07:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:22.601 210650 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Sep 30 07:07:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:07:22.635 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b64529aa-7e34-46d7-889d-1bcd510a217a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:07:23 compute-0 podman[210655]: 2025-09-30 07:07:23.499723128 +0000 UTC m=+0.081910406 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:07:26 compute-0 nova_compute[189265]: 2025-09-30 07:07:26.640 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:07:26 compute-0 nova_compute[189265]: 2025-09-30 07:07:26.641 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.154 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.155 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.155 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.155 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.155 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.155 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.156 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.156 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:07:27 compute-0 podman[210677]: 2025-09-30 07:07:27.467775415 +0000 UTC m=+0.058970366 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, release=1755695350, com.redhat.component=ubi9-minimal-container, distribution-scope=public, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.683 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.683 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.683 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.683 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.812 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.813 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.828 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.829 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6074MB free_disk=73.34300231933594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.829 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:07:27 compute-0 nova_compute[189265]: 2025-09-30 07:07:27.829 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:07:28 compute-0 nova_compute[189265]: 2025-09-30 07:07:28.935 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:07:28 compute-0 nova_compute[189265]: 2025-09-30 07:07:28.935 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:07:27 up  1:05,  0 user,  load average: 0.23, 0.70, 0.66\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:07:28 compute-0 nova_compute[189265]: 2025-09-30 07:07:28.958 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:07:29 compute-0 nova_compute[189265]: 2025-09-30 07:07:29.465 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:07:29 compute-0 podman[210699]: 2025-09-30 07:07:29.544263393 +0000 UTC m=+0.114585365 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:07:29 compute-0 podman[210700]: 2025-09-30 07:07:29.631158402 +0000 UTC m=+0.195546774 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4)
Sep 30 07:07:29 compute-0 podman[199733]: time="2025-09-30T07:07:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:07:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:07:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:07:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:07:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2979 "" "Go-http-client/1.1"
Sep 30 07:07:29 compute-0 nova_compute[189265]: 2025-09-30 07:07:29.977 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:07:29 compute-0 nova_compute[189265]: 2025-09-30 07:07:29.977 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.148s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:07:31 compute-0 openstack_network_exporter[201859]: ERROR   07:07:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:07:31 compute-0 openstack_network_exporter[201859]: ERROR   07:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:07:31 compute-0 openstack_network_exporter[201859]: ERROR   07:07:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:07:31 compute-0 openstack_network_exporter[201859]: ERROR   07:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:07:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:07:31 compute-0 openstack_network_exporter[201859]: ERROR   07:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:07:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:07:31 compute-0 podman[210746]: 2025-09-30 07:07:31.484536104 +0000 UTC m=+0.067196133 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent)
Sep 30 07:07:44 compute-0 podman[210765]: 2025-09-30 07:07:44.504879483 +0000 UTC m=+0.072392024 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:07:54 compute-0 podman[210789]: 2025-09-30 07:07:54.479703102 +0000 UTC m=+0.062863213 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Sep 30 07:07:58 compute-0 podman[210810]: 2025-09-30 07:07:58.481852751 +0000 UTC m=+0.063934755 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal 
Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, name=ubi9-minimal, managed_by=edpm_ansible, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, release=1755695350, architecture=x86_64, maintainer=Red Hat, Inc.)
Sep 30 07:07:59 compute-0 podman[199733]: time="2025-09-30T07:07:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:07:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:07:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:07:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:07:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2983 "" "Go-http-client/1.1"
Sep 30 07:08:00 compute-0 podman[210831]: 2025-09-30 07:08:00.522039737 +0000 UTC m=+0.103874922 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 07:08:00 compute-0 podman[210832]: 2025-09-30 07:08:00.536521914 +0000 UTC m=+0.112974950 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 07:08:01 compute-0 openstack_network_exporter[201859]: ERROR   07:08:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:08:01 compute-0 openstack_network_exporter[201859]: ERROR   07:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:08:01 compute-0 openstack_network_exporter[201859]: ERROR   07:08:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:08:01 compute-0 openstack_network_exporter[201859]: ERROR   07:08:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:08:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:08:01 compute-0 openstack_network_exporter[201859]: ERROR   07:08:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:08:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:08:02 compute-0 podman[210877]: 2025-09-30 07:08:02.475015685 +0000 UTC m=+0.058317240 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 07:08:15 compute-0 podman[210897]: 2025-09-30 07:08:15.483174012 +0000 UTC m=+0.066251983 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:08:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:08:20.525 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:08:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:08:20.526 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:08:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:08:20.526 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:08:22 compute-0 nova_compute[189265]: 2025-09-30 07:08:22.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:08:22 compute-0 nova_compute[189265]: 2025-09-30 07:08:22.789 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 07:08:23 compute-0 nova_compute[189265]: 2025-09-30 07:08:23.299 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 07:08:23 compute-0 nova_compute[189265]: 2025-09-30 07:08:23.301 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:08:23 compute-0 nova_compute[189265]: 2025-09-30 07:08:23.301 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 07:08:23 compute-0 nova_compute[189265]: 2025-09-30 07:08:23.819 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:08:25 compute-0 podman[210922]: 2025-09-30 07:08:25.474409434 +0000 UTC m=+0.062965226 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:08:28 compute-0 nova_compute[189265]: 2025-09-30 07:08:28.320 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:08:28 compute-0 nova_compute[189265]: 2025-09-30 07:08:28.320 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:08:28 compute-0 nova_compute[189265]: 2025-09-30 07:08:28.320 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:08:28 compute-0 nova_compute[189265]: 2025-09-30 07:08:28.321 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:08:28 compute-0 nova_compute[189265]: 2025-09-30 07:08:28.321 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:08:28 compute-0 nova_compute[189265]: 2025-09-30 07:08:28.321 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:08:28 compute-0 nova_compute[189265]: 2025-09-30 07:08:28.321 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:08:28 compute-0 nova_compute[189265]: 2025-09-30 07:08:28.322 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:08:28 compute-0 nova_compute[189265]: 2025-09-30 07:08:28.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:08:29 compute-0 nova_compute[189265]: 2025-09-30 07:08:29.305 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:08:29 compute-0 nova_compute[189265]: 2025-09-30 07:08:29.305 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:08:29 compute-0 nova_compute[189265]: 2025-09-30 07:08:29.306 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:08:29 compute-0 nova_compute[189265]: 2025-09-30 07:08:29.306 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:08:29 compute-0 nova_compute[189265]: 2025-09-30 07:08:29.500 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:08:29 compute-0 nova_compute[189265]: 2025-09-30 07:08:29.502 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:08:29 compute-0 podman[210943]: 2025-09-30 07:08:29.509912585 +0000 UTC m=+0.083152381 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 07:08:29 compute-0 nova_compute[189265]: 2025-09-30 07:08:29.521 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:08:29 compute-0 nova_compute[189265]: 2025-09-30 07:08:29.522 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6099MB free_disk=73.34300231933594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:08:29 compute-0 nova_compute[189265]: 2025-09-30 07:08:29.522 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:08:29 compute-0 nova_compute[189265]: 2025-09-30 07:08:29.522 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:08:29 compute-0 podman[199733]: time="2025-09-30T07:08:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:08:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:08:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:08:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:08:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2982 "" "Go-http-client/1.1"
Sep 30 07:08:30 compute-0 nova_compute[189265]: 2025-09-30 07:08:30.583 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:08:30 compute-0 nova_compute[189265]: 2025-09-30 07:08:30.583 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:08:29 up  1:06,  0 user,  load average: 0.23, 0.59, 0.62\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:08:30 compute-0 nova_compute[189265]: 2025-09-30 07:08:30.601 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:08:31 compute-0 nova_compute[189265]: 2025-09-30 07:08:31.111 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:08:31 compute-0 openstack_network_exporter[201859]: ERROR   07:08:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:08:31 compute-0 openstack_network_exporter[201859]: ERROR   07:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:08:31 compute-0 openstack_network_exporter[201859]: ERROR   07:08:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:08:31 compute-0 openstack_network_exporter[201859]: ERROR   07:08:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:08:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:08:31 compute-0 openstack_network_exporter[201859]: ERROR   07:08:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:08:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:08:31 compute-0 podman[210967]: 2025-09-30 07:08:31.512256858 +0000 UTC m=+0.084469181 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 07:08:31 compute-0 podman[210968]: 2025-09-30 07:08:31.57068643 +0000 UTC m=+0.135775583 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 07:08:31 compute-0 nova_compute[189265]: 2025-09-30 07:08:31.621 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:08:31 compute-0 nova_compute[189265]: 2025-09-30 07:08:31.621 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.099s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:08:33 compute-0 podman[211012]: 2025-09-30 07:08:33.472293463 +0000 UTC m=+0.058023891 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930)
Sep 30 07:08:46 compute-0 podman[211033]: 2025-09-30 07:08:46.485821468 +0000 UTC m=+0.068961388 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:08:56 compute-0 podman[211059]: 2025-09-30 07:08:56.504281741 +0000 UTC m=+0.085412872 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, tcib_managed=true)
Sep 30 07:08:59 compute-0 podman[199733]: time="2025-09-30T07:08:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:08:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:08:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:08:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:08:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2982 "" "Go-http-client/1.1"
Sep 30 07:09:00 compute-0 podman[211080]: 2025-09-30 07:09:00.47065061 +0000 UTC m=+0.059058494 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Sep 30 07:09:01 compute-0 openstack_network_exporter[201859]: ERROR   07:09:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:09:01 compute-0 openstack_network_exporter[201859]: ERROR   07:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:09:01 compute-0 openstack_network_exporter[201859]: ERROR   07:09:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:09:01 compute-0 openstack_network_exporter[201859]: ERROR   07:09:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:09:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:09:01 compute-0 openstack_network_exporter[201859]: ERROR   07:09:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:09:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:09:02 compute-0 podman[211101]: 2025-09-30 07:09:02.504815338 +0000 UTC m=+0.079984096 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:09:02 compute-0 podman[211102]: 2025-09-30 07:09:02.558736292 +0000 UTC m=+0.127564157 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 07:09:04 compute-0 podman[211147]: 2025-09-30 07:09:04.489432679 +0000 UTC m=+0.071874092 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:09:17 compute-0 podman[211167]: 2025-09-30 07:09:17.488915845 +0000 UTC m=+0.068954368 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:09:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:09:20.527 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:09:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:09:20.528 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:09:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:09:20.528 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:09:27 compute-0 podman[211193]: 2025-09-30 07:09:27.4769411 +0000 UTC m=+0.060859554 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 07:09:29 compute-0 nova_compute[189265]: 2025-09-30 07:09:29.617 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:09:29 compute-0 nova_compute[189265]: 2025-09-30 07:09:29.618 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:09:29 compute-0 podman[199733]: time="2025-09-30T07:09:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:09:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:09:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:09:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:09:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2982 "" "Go-http-client/1.1"
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.139 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.140 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.140 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.140 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.140 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.141 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.141 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.141 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.665 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.666 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.666 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.667 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.845 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.846 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.867 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.868 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6103MB free_disk=73.34324645996094GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.868 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:09:30 compute-0 nova_compute[189265]: 2025-09-30 07:09:30.868 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:09:31 compute-0 openstack_network_exporter[201859]: ERROR   07:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:09:31 compute-0 openstack_network_exporter[201859]: ERROR   07:09:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:09:31 compute-0 openstack_network_exporter[201859]: ERROR   07:09:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:09:31 compute-0 openstack_network_exporter[201859]: ERROR   07:09:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:09:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:09:31 compute-0 openstack_network_exporter[201859]: ERROR   07:09:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:09:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:09:31 compute-0 podman[211214]: 2025-09-30 07:09:31.518791895 +0000 UTC m=+0.102653229 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-type=git, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 07:09:31 compute-0 nova_compute[189265]: 2025-09-30 07:09:31.968 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:09:31 compute-0 nova_compute[189265]: 2025-09-30 07:09:31.968 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:09:30 up  1:07,  0 user,  load average: 0.08, 0.48, 0.58\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:09:32 compute-0 nova_compute[189265]: 2025-09-30 07:09:32.013 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing inventories for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 07:09:32 compute-0 nova_compute[189265]: 2025-09-30 07:09:32.082 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating ProviderTree inventory for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 07:09:32 compute-0 nova_compute[189265]: 2025-09-30 07:09:32.082 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating inventory in ProviderTree for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 07:09:32 compute-0 nova_compute[189265]: 2025-09-30 07:09:32.099 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing aggregate associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 07:09:32 compute-0 nova_compute[189265]: 2025-09-30 07:09:32.128 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing trait associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, traits: COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_FMA3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_AC97,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_ABM,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,HW_CPU_X86_CLMUL _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 07:09:32 compute-0 nova_compute[189265]: 2025-09-30 07:09:32.154 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:09:32 compute-0 nova_compute[189265]: 2025-09-30 07:09:32.670 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:09:33 compute-0 nova_compute[189265]: 2025-09-30 07:09:33.211 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:09:33 compute-0 nova_compute[189265]: 2025-09-30 07:09:33.212 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.343s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:09:33 compute-0 podman[211235]: 2025-09-30 07:09:33.506911116 +0000 UTC m=+0.079362578 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 07:09:33 compute-0 podman[211236]: 2025-09-30 07:09:33.552628714 +0000 UTC m=+0.120508844 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 07:09:35 compute-0 podman[211282]: 2025-09-30 07:09:35.492621529 +0000 UTC m=+0.075139367 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:09:48 compute-0 podman[211302]: 2025-09-30 07:09:48.509934399 +0000 UTC m=+0.084021224 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:09:58 compute-0 podman[211327]: 2025-09-30 07:09:58.494224225 +0000 UTC m=+0.075669656 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid, org.label-schema.schema-version=1.0)
Sep 30 07:09:59 compute-0 podman[199733]: time="2025-09-30T07:09:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:09:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:09:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:09:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:09:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2985 "" "Go-http-client/1.1"
Sep 30 07:10:01 compute-0 openstack_network_exporter[201859]: ERROR   07:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:10:01 compute-0 openstack_network_exporter[201859]: ERROR   07:10:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:10:01 compute-0 openstack_network_exporter[201859]: ERROR   07:10:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:10:01 compute-0 openstack_network_exporter[201859]: ERROR   07:10:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:10:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:10:01 compute-0 openstack_network_exporter[201859]: ERROR   07:10:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:10:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:10:02 compute-0 podman[211348]: 2025-09-30 07:10:02.524960898 +0000 UTC m=+0.095887543 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 07:10:04 compute-0 podman[211371]: 2025-09-30 07:10:04.485802166 +0000 UTC m=+0.061867231 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:10:04 compute-0 podman[211372]: 2025-09-30 07:10:04.529183296 +0000 UTC m=+0.099554588 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 07:10:06 compute-0 podman[211414]: 2025-09-30 07:10:06.504050045 +0000 UTC m=+0.079769513 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Sep 30 07:10:19 compute-0 podman[211434]: 2025-09-30 07:10:19.495632865 +0000 UTC m=+0.067064209 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:10:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:10:20.529 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:10:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:10:20.529 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:10:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:10:20.529 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:10:29 compute-0 podman[211459]: 2025-09-30 07:10:29.536649774 +0000 UTC m=+0.107667221 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, org.label-schema.build-date=20250930)
Sep 30 07:10:29 compute-0 podman[199733]: time="2025-09-30T07:10:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:10:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:10:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:10:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:10:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2981 "" "Go-http-client/1.1"
Sep 30 07:10:31 compute-0 openstack_network_exporter[201859]: ERROR   07:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:10:31 compute-0 openstack_network_exporter[201859]: ERROR   07:10:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:10:31 compute-0 openstack_network_exporter[201859]: ERROR   07:10:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:10:31 compute-0 openstack_network_exporter[201859]: ERROR   07:10:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:10:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:10:31 compute-0 openstack_network_exporter[201859]: ERROR   07:10:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:10:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.213 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.213 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.214 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.214 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.214 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.214 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.214 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.214 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.215 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:10:33 compute-0 podman[211479]: 2025-09-30 07:10:33.470351681 +0000 UTC m=+0.058010299 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.expose-services=, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7)
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.724 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.725 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.725 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.725 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.866 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.867 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.879 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.012s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.880 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6093MB free_disk=73.34326553344727GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.880 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:10:33 compute-0 nova_compute[189265]: 2025-09-30 07:10:33.880 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:10:34 compute-0 nova_compute[189265]: 2025-09-30 07:10:34.936 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:10:34 compute-0 nova_compute[189265]: 2025-09-30 07:10:34.936 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:10:33 up  1:08,  0 user,  load average: 0.03, 0.38, 0.54\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:10:34 compute-0 nova_compute[189265]: 2025-09-30 07:10:34.959 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:10:35 compute-0 nova_compute[189265]: 2025-09-30 07:10:35.466 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:10:35 compute-0 podman[211503]: 2025-09-30 07:10:35.467354642 +0000 UTC m=+0.059174014 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 07:10:35 compute-0 podman[211504]: 2025-09-30 07:10:35.511041581 +0000 UTC m=+0.099360083 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Sep 30 07:10:35 compute-0 nova_compute[189265]: 2025-09-30 07:10:35.975 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:10:35 compute-0 nova_compute[189265]: 2025-09-30 07:10:35.976 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.095s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:10:37 compute-0 podman[211549]: 2025-09-30 07:10:37.489098841 +0000 UTC m=+0.068574993 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 07:10:50 compute-0 podman[211569]: 2025-09-30 07:10:50.492686755 +0000 UTC m=+0.074056880 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:10:59 compute-0 podman[199733]: time="2025-09-30T07:10:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:10:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:10:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:10:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:10:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2979 "" "Go-http-client/1.1"
Sep 30 07:11:00 compute-0 podman[211593]: 2025-09-30 07:11:00.518549861 +0000 UTC m=+0.100836955 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 07:11:01 compute-0 openstack_network_exporter[201859]: ERROR   07:11:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:11:01 compute-0 openstack_network_exporter[201859]: ERROR   07:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:11:01 compute-0 openstack_network_exporter[201859]: ERROR   07:11:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:11:01 compute-0 openstack_network_exporter[201859]: ERROR   07:11:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:11:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:11:01 compute-0 openstack_network_exporter[201859]: ERROR   07:11:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:11:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:11:04 compute-0 podman[211613]: 2025-09-30 07:11:04.491628225 +0000 UTC m=+0.074594494 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.buildah.version=1.33.7, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 07:11:06 compute-0 podman[211636]: 2025-09-30 07:11:06.489291885 +0000 UTC m=+0.071593919 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20250930, managed_by=edpm_ansible, tcib_managed=true)
Sep 30 07:11:06 compute-0 podman[211637]: 2025-09-30 07:11:06.495191313 +0000 UTC m=+0.082474989 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:11:08 compute-0 podman[211682]: 2025-09-30 07:11:08.462475535 +0000 UTC m=+0.049353283 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:11:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:11:20.530 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:11:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:11:20.530 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:11:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:11:20.530 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:11:21 compute-0 podman[211704]: 2025-09-30 07:11:21.508962976 +0000 UTC m=+0.080471123 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:11:28 compute-0 unix_chkpwd[211730]: password check failed for user (root)
Sep 30 07:11:28 compute-0 sshd-session[211728]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.79  user=root
Sep 30 07:11:29 compute-0 podman[199733]: time="2025-09-30T07:11:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:11:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:11:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:11:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:11:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2981 "" "Go-http-client/1.1"
Sep 30 07:11:30 compute-0 sshd-session[211728]: Failed password for root from 91.224.92.79 port 20954 ssh2
Sep 30 07:11:30 compute-0 unix_chkpwd[211732]: password check failed for user (root)
Sep 30 07:11:31 compute-0 openstack_network_exporter[201859]: ERROR   07:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:11:31 compute-0 openstack_network_exporter[201859]: ERROR   07:11:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:11:31 compute-0 openstack_network_exporter[201859]: ERROR   07:11:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:11:31 compute-0 openstack_network_exporter[201859]: ERROR   07:11:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:11:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:11:31 compute-0 openstack_network_exporter[201859]: ERROR   07:11:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:11:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:11:31 compute-0 podman[211733]: 2025-09-30 07:11:31.512449441 +0000 UTC m=+0.085772144 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 07:11:32 compute-0 sshd-session[211728]: Failed password for root from 91.224.92.79 port 20954 ssh2
Sep 30 07:11:32 compute-0 nova_compute[189265]: 2025-09-30 07:11:32.546 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:11:32 compute-0 nova_compute[189265]: 2025-09-30 07:11:32.546 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:11:32 compute-0 unix_chkpwd[211754]: password check failed for user (root)
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.064 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.064 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.064 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.065 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.065 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.065 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.066 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.066 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.584 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.585 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.585 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.585 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.762 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.764 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.780 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.780 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6097MB free_disk=73.34324264526367GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.781 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:11:33 compute-0 nova_compute[189265]: 2025-09-30 07:11:33.781 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:11:35 compute-0 sshd-session[211728]: Failed password for root from 91.224.92.79 port 20954 ssh2
Sep 30 07:11:35 compute-0 nova_compute[189265]: 2025-09-30 07:11:35.385 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:11:35 compute-0 nova_compute[189265]: 2025-09-30 07:11:35.385 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:11:33 up  1:09,  0 user,  load average: 0.01, 0.31, 0.50\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:11:35 compute-0 nova_compute[189265]: 2025-09-30 07:11:35.417 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:11:35 compute-0 podman[211756]: 2025-09-30 07:11:35.535494684 +0000 UTC m=+0.105649722 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vendor=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, managed_by=edpm_ansible)
Sep 30 07:11:36 compute-0 nova_compute[189265]: 2025-09-30 07:11:36.240 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:11:36 compute-0 sshd-session[211728]: Received disconnect from 91.224.92.79 port 20954:11:  [preauth]
Sep 30 07:11:36 compute-0 sshd-session[211728]: Disconnected from authenticating user root 91.224.92.79 port 20954 [preauth]
Sep 30 07:11:36 compute-0 sshd-session[211728]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.79  user=root
Sep 30 07:11:36 compute-0 nova_compute[189265]: 2025-09-30 07:11:36.881 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:11:36 compute-0 nova_compute[189265]: 2025-09-30 07:11:36.881 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.100s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:11:37 compute-0 podman[211780]: 2025-09-30 07:11:37.520339549 +0000 UTC m=+0.090449218 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:11:37 compute-0 podman[211781]: 2025-09-30 07:11:37.550729458 +0000 UTC m=+0.112815608 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 07:11:37 compute-0 unix_chkpwd[211826]: password check failed for user (root)
Sep 30 07:11:37 compute-0 sshd-session[211778]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.79  user=root
Sep 30 07:11:39 compute-0 podman[211827]: 2025-09-30 07:11:39.501027003 +0000 UTC m=+0.081478531 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 07:11:39 compute-0 sshd-session[211778]: Failed password for root from 91.224.92.79 port 32880 ssh2
Sep 30 07:11:41 compute-0 unix_chkpwd[211847]: password check failed for user (root)
Sep 30 07:11:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:11:43.842 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:11:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:11:43.843 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:11:43 compute-0 sshd-session[211778]: Failed password for root from 91.224.92.79 port 32880 ssh2
Sep 30 07:11:46 compute-0 unix_chkpwd[211849]: password check failed for user (root)
Sep 30 07:11:47 compute-0 sshd-session[211778]: Failed password for root from 91.224.92.79 port 32880 ssh2
Sep 30 07:11:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:11:47.879 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:25:a1:04 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-28ce26c5-5aef-4180-a5db-bc4a1ede5db8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28ce26c5-5aef-4180-a5db-bc4a1ede5db8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '845385b9c91b47df88d3ec00d6d6f1dd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c774eb2-ee2a-496d-bf51-c0d978319afe, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=345e1126-0b34-4939-bbe1-8907bddd4710) old=Port_Binding(mac=['fa:16:3e:25:a1:04'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-28ce26c5-5aef-4180-a5db-bc4a1ede5db8', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28ce26c5-5aef-4180-a5db-bc4a1ede5db8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '845385b9c91b47df88d3ec00d6d6f1dd', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:11:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:11:47.880 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 345e1126-0b34-4939-bbe1-8907bddd4710 in datapath 28ce26c5-5aef-4180-a5db-bc4a1ede5db8 updated
Sep 30 07:11:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:11:47.882 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 28ce26c5-5aef-4180-a5db-bc4a1ede5db8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:11:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:11:47.883 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[04b6de6b-3677-4d37-abee-6932754a9e95]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:11:48 compute-0 sshd-session[211778]: Received disconnect from 91.224.92.79 port 32880:11:  [preauth]
Sep 30 07:11:48 compute-0 sshd-session[211778]: Disconnected from authenticating user root 91.224.92.79 port 32880 [preauth]
Sep 30 07:11:48 compute-0 sshd-session[211778]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.79  user=root
Sep 30 07:11:49 compute-0 unix_chkpwd[211852]: password check failed for user (root)
Sep 30 07:11:49 compute-0 sshd-session[211850]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.79  user=root
Sep 30 07:11:51 compute-0 sshd-session[211850]: Failed password for root from 91.224.92.79 port 35318 ssh2
Sep 30 07:11:51 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:11:51.845 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:11:52 compute-0 podman[211853]: 2025-09-30 07:11:52.45367447 +0000 UTC m=+0.040902911 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:11:53 compute-0 unix_chkpwd[211879]: password check failed for user (root)
Sep 30 07:11:54 compute-0 sshd-session[211850]: Failed password for root from 91.224.92.79 port 35318 ssh2
Sep 30 07:11:55 compute-0 unix_chkpwd[211880]: password check failed for user (root)
Sep 30 07:11:57 compute-0 sshd-session[211850]: Failed password for root from 91.224.92.79 port 35318 ssh2
Sep 30 07:11:59 compute-0 sshd-session[211850]: Received disconnect from 91.224.92.79 port 35318:11:  [preauth]
Sep 30 07:11:59 compute-0 sshd-session[211850]: Disconnected from authenticating user root 91.224.92.79 port 35318 [preauth]
Sep 30 07:11:59 compute-0 sshd-session[211850]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.79  user=root
Sep 30 07:11:59 compute-0 podman[199733]: time="2025-09-30T07:11:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:11:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:11:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:11:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:11:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2982 "" "Go-http-client/1.1"
Sep 30 07:12:01 compute-0 openstack_network_exporter[201859]: ERROR   07:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:12:01 compute-0 openstack_network_exporter[201859]: ERROR   07:12:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:12:01 compute-0 openstack_network_exporter[201859]: ERROR   07:12:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:12:01 compute-0 openstack_network_exporter[201859]: ERROR   07:12:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:12:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:12:01 compute-0 openstack_network_exporter[201859]: ERROR   07:12:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:12:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:12:01 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:12:01.482 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:3b:eb 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-37855c85-a81c-4cc4-b274-c1d1a851e5d0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37855c85-a81c-4cc4-b274-c1d1a851e5d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'feae7fe733584b77a828fa9645022986', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b37dccac-2213-44ce-b6e2-ab7bc3d644ea, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3f7e27f2-f7c5-4b9e-ba55-ba3da9700595) old=Port_Binding(mac=['fa:16:3e:b1:3b:eb'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-37855c85-a81c-4cc4-b274-c1d1a851e5d0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-37855c85-a81c-4cc4-b274-c1d1a851e5d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'feae7fe733584b77a828fa9645022986', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:12:01 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:12:01.483 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3f7e27f2-f7c5-4b9e-ba55-ba3da9700595 in datapath 37855c85-a81c-4cc4-b274-c1d1a851e5d0 updated
Sep 30 07:12:01 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:12:01.483 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 37855c85-a81c-4cc4-b274-c1d1a851e5d0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:12:01 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:12:01.484 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c04af5f7-8d49-43be-9540-a25b3512f702]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:12:02 compute-0 podman[211881]: 2025-09-30 07:12:02.460351187 +0000 UTC m=+0.052269686 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 07:12:06 compute-0 podman[211902]: 2025-09-30 07:12:06.478228632 +0000 UTC m=+0.066781131 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, version=9.6, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41)
Sep 30 07:12:08 compute-0 podman[211924]: 2025-09-30 07:12:08.493091754 +0000 UTC m=+0.074346677 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Sep 30 07:12:08 compute-0 podman[211925]: 2025-09-30 07:12:08.573937116 +0000 UTC m=+0.140100688 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 07:12:10 compute-0 podman[211968]: 2025-09-30 07:12:10.492618357 +0000 UTC m=+0.079258377 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 07:12:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:12:20.531 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:12:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:12:20.532 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:12:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:12:20.532 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:12:23 compute-0 podman[211988]: 2025-09-30 07:12:23.521242638 +0000 UTC m=+0.100430314 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:12:29 compute-0 podman[199733]: time="2025-09-30T07:12:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:12:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:12:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:12:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:12:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2979 "" "Go-http-client/1.1"
Sep 30 07:12:31 compute-0 openstack_network_exporter[201859]: ERROR   07:12:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:12:31 compute-0 openstack_network_exporter[201859]: ERROR   07:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:12:31 compute-0 openstack_network_exporter[201859]: ERROR   07:12:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:12:31 compute-0 openstack_network_exporter[201859]: ERROR   07:12:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:12:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:12:31 compute-0 openstack_network_exporter[201859]: ERROR   07:12:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:12:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:12:33 compute-0 podman[212012]: 2025-09-30 07:12:33.476179627 +0000 UTC m=+0.064889007 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid)
Sep 30 07:12:36 compute-0 nova_compute[189265]: 2025-09-30 07:12:36.883 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:12:36 compute-0 nova_compute[189265]: 2025-09-30 07:12:36.883 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:12:36 compute-0 nova_compute[189265]: 2025-09-30 07:12:36.884 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:12:36 compute-0 nova_compute[189265]: 2025-09-30 07:12:36.884 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:12:36 compute-0 nova_compute[189265]: 2025-09-30 07:12:36.885 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:12:36 compute-0 nova_compute[189265]: 2025-09-30 07:12:36.885 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:12:36 compute-0 nova_compute[189265]: 2025-09-30 07:12:36.886 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:12:36 compute-0 nova_compute[189265]: 2025-09-30 07:12:36.886 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:12:36 compute-0 nova_compute[189265]: 2025-09-30 07:12:36.887 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:12:37 compute-0 nova_compute[189265]: 2025-09-30 07:12:37.399 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:12:37 compute-0 nova_compute[189265]: 2025-09-30 07:12:37.399 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:12:37 compute-0 nova_compute[189265]: 2025-09-30 07:12:37.399 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:12:37 compute-0 nova_compute[189265]: 2025-09-30 07:12:37.399 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:12:37 compute-0 podman[212032]: 2025-09-30 07:12:37.486619529 +0000 UTC m=+0.071102346 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 07:12:37 compute-0 nova_compute[189265]: 2025-09-30 07:12:37.560 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:12:37 compute-0 nova_compute[189265]: 2025-09-30 07:12:37.561 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:12:37 compute-0 nova_compute[189265]: 2025-09-30 07:12:37.581 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:12:37 compute-0 nova_compute[189265]: 2025-09-30 07:12:37.582 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6087MB free_disk=73.34280776977539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:12:37 compute-0 nova_compute[189265]: 2025-09-30 07:12:37.582 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:12:37 compute-0 nova_compute[189265]: 2025-09-30 07:12:37.582 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:12:38 compute-0 nova_compute[189265]: 2025-09-30 07:12:38.703 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:12:38 compute-0 nova_compute[189265]: 2025-09-30 07:12:38.704 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:12:37 up  1:10,  0 user,  load average: 0.24, 0.32, 0.49\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:12:38 compute-0 nova_compute[189265]: 2025-09-30 07:12:38.722 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:12:39 compute-0 nova_compute[189265]: 2025-09-30 07:12:39.279 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:12:39 compute-0 podman[212055]: 2025-09-30 07:12:39.51333041 +0000 UTC m=+0.077892288 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:12:39 compute-0 podman[212056]: 2025-09-30 07:12:39.530586974 +0000 UTC m=+0.096154301 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 07:12:39 compute-0 nova_compute[189265]: 2025-09-30 07:12:39.799 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:12:39 compute-0 nova_compute[189265]: 2025-09-30 07:12:39.800 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.217s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:12:41 compute-0 podman[212100]: 2025-09-30 07:12:41.50469052 +0000 UTC m=+0.077673663 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:12:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:12:49.694 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:12:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:12:49.695 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:12:50 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:12:50.696 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:12:54 compute-0 podman[212120]: 2025-09-30 07:12:54.463162034 +0000 UTC m=+0.044487703 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:12:59 compute-0 podman[199733]: time="2025-09-30T07:12:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:12:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:12:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:12:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:12:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2984 "" "Go-http-client/1.1"
Sep 30 07:13:01 compute-0 openstack_network_exporter[201859]: ERROR   07:13:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:13:01 compute-0 openstack_network_exporter[201859]: ERROR   07:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:13:01 compute-0 openstack_network_exporter[201859]: ERROR   07:13:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:13:01 compute-0 openstack_network_exporter[201859]: ERROR   07:13:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:13:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:13:01 compute-0 openstack_network_exporter[201859]: ERROR   07:13:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:13:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:13:04 compute-0 podman[212144]: 2025-09-30 07:13:04.499753536 +0000 UTC m=+0.080779011 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 07:13:08 compute-0 podman[212164]: 2025-09-30 07:13:08.467315942 +0000 UTC m=+0.053488601 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, name=ubi9-minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container)
Sep 30 07:13:10 compute-0 podman[212185]: 2025-09-30 07:13:10.504623977 +0000 UTC m=+0.084176769 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Sep 30 07:13:10 compute-0 podman[212186]: 2025-09-30 07:13:10.563603614 +0000 UTC m=+0.138494923 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 07:13:12 compute-0 podman[212231]: 2025-09-30 07:13:12.507749623 +0000 UTC m=+0.081322837 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Sep 30 07:13:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:13:20.534 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:13:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:13:20.534 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:13:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:13:20.534 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:13:25 compute-0 podman[212251]: 2025-09-30 07:13:25.515288209 +0000 UTC m=+0.090329714 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:13:28 compute-0 nova_compute[189265]: 2025-09-30 07:13:28.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:13:28 compute-0 nova_compute[189265]: 2025-09-30 07:13:28.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 07:13:29 compute-0 nova_compute[189265]: 2025-09-30 07:13:29.346 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 07:13:29 compute-0 podman[199733]: time="2025-09-30T07:13:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:13:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:13:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:13:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:13:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2983 "" "Go-http-client/1.1"
Sep 30 07:13:30 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:13:30.145 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1f:ef:af 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1dc2a906d2242f79ffab81c2cf3c4d7', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b541691-433c-426c-b8b7-10d79319603a, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=0c700e20-e593-4a77-93d7-fc919dc1f294) old=Port_Binding(mac=['fa:16:3e:1f:ef:af'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b1dc2a906d2242f79ffab81c2cf3c4d7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:13:30 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:13:30.146 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 0c700e20-e593-4a77-93d7-fc919dc1f294 in datapath 74ffbf65-ebbd-4587-bf5b-0b38421a4813 updated
Sep 30 07:13:30 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:13:30.146 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74ffbf65-ebbd-4587-bf5b-0b38421a4813, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:13:30 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:13:30.147 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[fa5afd5d-423e-4abf-998c-eadc4d17a473]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:13:30 compute-0 nova_compute[189265]: 2025-09-30 07:13:30.341 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:13:30 compute-0 nova_compute[189265]: 2025-09-30 07:13:30.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:13:30 compute-0 nova_compute[189265]: 2025-09-30 07:13:30.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:13:30 compute-0 nova_compute[189265]: 2025-09-30 07:13:30.789 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:13:31 compute-0 openstack_network_exporter[201859]: ERROR   07:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:13:31 compute-0 openstack_network_exporter[201859]: ERROR   07:13:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:13:31 compute-0 openstack_network_exporter[201859]: ERROR   07:13:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:13:31 compute-0 openstack_network_exporter[201859]: ERROR   07:13:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:13:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:13:31 compute-0 openstack_network_exporter[201859]: ERROR   07:13:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:13:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:13:31 compute-0 nova_compute[189265]: 2025-09-30 07:13:31.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:13:32 compute-0 nova_compute[189265]: 2025-09-30 07:13:32.351 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:13:32 compute-0 nova_compute[189265]: 2025-09-30 07:13:32.352 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:13:32 compute-0 nova_compute[189265]: 2025-09-30 07:13:32.352 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:13:32 compute-0 nova_compute[189265]: 2025-09-30 07:13:32.352 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 07:13:33 compute-0 nova_compute[189265]: 2025-09-30 07:13:33.350 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:13:33 compute-0 nova_compute[189265]: 2025-09-30 07:13:33.878 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:13:33 compute-0 nova_compute[189265]: 2025-09-30 07:13:33.879 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:13:33 compute-0 nova_compute[189265]: 2025-09-30 07:13:33.879 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:13:33 compute-0 nova_compute[189265]: 2025-09-30 07:13:33.879 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:13:34 compute-0 nova_compute[189265]: 2025-09-30 07:13:34.031 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:13:34 compute-0 nova_compute[189265]: 2025-09-30 07:13:34.032 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:13:34 compute-0 nova_compute[189265]: 2025-09-30 07:13:34.060 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:13:34 compute-0 nova_compute[189265]: 2025-09-30 07:13:34.060 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6090MB free_disk=73.34280776977539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:13:34 compute-0 nova_compute[189265]: 2025-09-30 07:13:34.061 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:13:34 compute-0 nova_compute[189265]: 2025-09-30 07:13:34.061 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:13:35 compute-0 nova_compute[189265]: 2025-09-30 07:13:35.214 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:13:35 compute-0 nova_compute[189265]: 2025-09-30 07:13:35.214 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:13:34 up  1:11,  0 user,  load average: 0.24, 0.30, 0.47\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:13:35 compute-0 nova_compute[189265]: 2025-09-30 07:13:35.235 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:13:35 compute-0 podman[212277]: 2025-09-30 07:13:35.489313932 +0000 UTC m=+0.068212721 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 07:13:35 compute-0 nova_compute[189265]: 2025-09-30 07:13:35.866 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:13:36 compute-0 nova_compute[189265]: 2025-09-30 07:13:36.536 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:13:36 compute-0 nova_compute[189265]: 2025-09-30 07:13:36.537 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.476s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:13:36 compute-0 nova_compute[189265]: 2025-09-30 07:13:36.537 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:13:37 compute-0 nova_compute[189265]: 2025-09-30 07:13:37.563 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:13:37 compute-0 nova_compute[189265]: 2025-09-30 07:13:37.564 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:13:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:13:39.359 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:71:2a 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-adc0d2ec-3cb2-473b-97af-5c4a23479334', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-adc0d2ec-3cb2-473b-97af-5c4a23479334', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1413b21c2db845e58d8a81f524a55f3a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1ebecf7b-c8a5-4b28-8079-e917d595156f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=e879968c-667e-4556-8f08-974832164a96) old=Port_Binding(mac=['fa:16:3e:f2:71:2a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-adc0d2ec-3cb2-473b-97af-5c4a23479334', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-adc0d2ec-3cb2-473b-97af-5c4a23479334', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1413b21c2db845e58d8a81f524a55f3a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:13:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:13:39.361 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port e879968c-667e-4556-8f08-974832164a96 in datapath adc0d2ec-3cb2-473b-97af-5c4a23479334 updated
Sep 30 07:13:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:13:39.363 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network adc0d2ec-3cb2-473b-97af-5c4a23479334, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:13:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:13:39.363 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f9cc3c50-8dec-4a19-a822-a7d875aad7bd]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:13:39 compute-0 podman[212297]: 2025-09-30 07:13:39.509568647 +0000 UTC m=+0.084003683 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=Red Hat, Inc., version=9.6, config_id=edpm, name=ubi9-minimal, release=1755695350, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public)
Sep 30 07:13:41 compute-0 podman[212319]: 2025-09-30 07:13:41.512592061 +0000 UTC m=+0.098121788 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 07:13:41 compute-0 podman[212320]: 2025-09-30 07:13:41.543967848 +0000 UTC m=+0.126043506 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Sep 30 07:13:43 compute-0 podman[212365]: 2025-09-30 07:13:43.4627176 +0000 UTC m=+0.054824410 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 07:13:49 compute-0 nova_compute[189265]: 2025-09-30 07:13:49.131 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:13:56 compute-0 podman[212385]: 2025-09-30 07:13:56.490178286 +0000 UTC m=+0.072206003 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:13:59 compute-0 podman[199733]: time="2025-09-30T07:13:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:13:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:13:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:13:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:13:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2981 "" "Go-http-client/1.1"
Sep 30 07:14:01 compute-0 openstack_network_exporter[201859]: ERROR   07:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:14:01 compute-0 openstack_network_exporter[201859]: ERROR   07:14:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:14:01 compute-0 openstack_network_exporter[201859]: ERROR   07:14:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:14:01 compute-0 openstack_network_exporter[201859]: ERROR   07:14:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:14:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:14:01 compute-0 openstack_network_exporter[201859]: ERROR   07:14:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:14:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:14:06 compute-0 podman[212411]: 2025-09-30 07:14:06.486330735 +0000 UTC m=+0.068780844 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 07:14:09 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:09.355 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:14:09 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:09.357 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:14:10 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:10.358 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:14:10 compute-0 podman[212432]: 2025-09-30 07:14:10.508092124 +0000 UTC m=+0.085060964 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, maintainer=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Sep 30 07:14:12 compute-0 podman[212454]: 2025-09-30 07:14:12.458311488 +0000 UTC m=+0.049176280 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:14:12 compute-0 podman[212455]: 2025-09-30 07:14:12.485241514 +0000 UTC m=+0.074264783 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Sep 30 07:14:14 compute-0 podman[212498]: 2025-09-30 07:14:14.508583958 +0000 UTC m=+0.087262387 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Sep 30 07:14:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:20.535 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:14:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:20.536 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:14:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:20.536 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:14:27 compute-0 podman[212518]: 2025-09-30 07:14:27.487148825 +0000 UTC m=+0.064357847 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:14:28 compute-0 nova_compute[189265]: 2025-09-30 07:14:28.722 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "9fa193fb-a398-4552-85b4-a346dffcf697" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:14:28 compute-0 nova_compute[189265]: 2025-09-30 07:14:28.723 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "9fa193fb-a398-4552-85b4-a346dffcf697" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:14:29 compute-0 nova_compute[189265]: 2025-09-30 07:14:29.240 2 DEBUG nova.compute.manager [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 07:14:29 compute-0 podman[199733]: time="2025-09-30T07:14:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:14:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:14:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:14:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:14:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2980 "" "Go-http-client/1.1"
Sep 30 07:14:30 compute-0 nova_compute[189265]: 2025-09-30 07:14:30.100 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:14:30 compute-0 nova_compute[189265]: 2025-09-30 07:14:30.102 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:14:30 compute-0 nova_compute[189265]: 2025-09-30 07:14:30.112 2 DEBUG nova.virt.hardware [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 07:14:30 compute-0 nova_compute[189265]: 2025-09-30 07:14:30.113 2 INFO nova.compute.claims [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Claim successful on node compute-0.ctlplane.example.com
Sep 30 07:14:31 compute-0 nova_compute[189265]: 2025-09-30 07:14:31.311 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:14:31 compute-0 nova_compute[189265]: 2025-09-30 07:14:31.312 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:14:31 compute-0 nova_compute[189265]: 2025-09-30 07:14:31.324 2 DEBUG nova.compute.provider_tree [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:14:31 compute-0 openstack_network_exporter[201859]: ERROR   07:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:14:31 compute-0 openstack_network_exporter[201859]: ERROR   07:14:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:14:31 compute-0 openstack_network_exporter[201859]: ERROR   07:14:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:14:31 compute-0 openstack_network_exporter[201859]: ERROR   07:14:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:14:31 compute-0 openstack_network_exporter[201859]: ERROR   07:14:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:14:31 compute-0 nova_compute[189265]: 2025-09-30 07:14:31.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:14:31 compute-0 nova_compute[189265]: 2025-09-30 07:14:31.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:14:31 compute-0 nova_compute[189265]: 2025-09-30 07:14:31.838 2 DEBUG nova.scheduler.client.report [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:14:32 compute-0 nova_compute[189265]: 2025-09-30 07:14:32.350 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.248s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:14:32 compute-0 nova_compute[189265]: 2025-09-30 07:14:32.351 2 DEBUG nova.compute.manager [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 07:14:32 compute-0 nova_compute[189265]: 2025-09-30 07:14:32.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:14:32 compute-0 nova_compute[189265]: 2025-09-30 07:14:32.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:14:32 compute-0 nova_compute[189265]: 2025-09-30 07:14:32.901 2 DEBUG nova.compute.manager [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 07:14:32 compute-0 nova_compute[189265]: 2025-09-30 07:14:32.901 2 DEBUG nova.network.neutron [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 07:14:32 compute-0 nova_compute[189265]: 2025-09-30 07:14:32.903 2 WARNING neutronclient.v2_0.client [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:14:32 compute-0 nova_compute[189265]: 2025-09-30 07:14:32.906 2 WARNING neutronclient.v2_0.client [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:14:33 compute-0 nova_compute[189265]: 2025-09-30 07:14:33.421 2 INFO nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 07:14:33 compute-0 nova_compute[189265]: 2025-09-30 07:14:33.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:14:33 compute-0 nova_compute[189265]: 2025-09-30 07:14:33.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:14:33 compute-0 nova_compute[189265]: 2025-09-30 07:14:33.985 2 DEBUG nova.compute.manager [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 07:14:34 compute-0 nova_compute[189265]: 2025-09-30 07:14:34.311 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:14:34 compute-0 nova_compute[189265]: 2025-09-30 07:14:34.312 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:14:34 compute-0 nova_compute[189265]: 2025-09-30 07:14:34.313 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:14:34 compute-0 nova_compute[189265]: 2025-09-30 07:14:34.313 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:14:34 compute-0 nova_compute[189265]: 2025-09-30 07:14:34.501 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:14:34 compute-0 nova_compute[189265]: 2025-09-30 07:14:34.502 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:14:34 compute-0 nova_compute[189265]: 2025-09-30 07:14:34.515 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.013s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:14:34 compute-0 nova_compute[189265]: 2025-09-30 07:14:34.516 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=6098MB free_disk=73.34327697753906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:14:34 compute-0 nova_compute[189265]: 2025-09-30 07:14:34.516 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:14:34 compute-0 nova_compute[189265]: 2025-09-30 07:14:34.517 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.010 2 DEBUG nova.compute.manager [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.014 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.015 2 INFO nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Creating image(s)
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.016 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "/var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.017 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "/var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.018 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "/var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.019 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.020 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.155 2 DEBUG nova.network.neutron [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Successfully created port: 5e18274a-8ca2-4391-88b8-e5a90d72fc7c _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.563 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance 9fa193fb-a398-4552-85b4-a346dffcf697 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.563 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.564 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:14:34 up  1:12,  0 user,  load average: 0.16, 0.26, 0.45\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_1413b21c2db845e58d8a81f524a55f3a': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.635 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing inventories for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.702 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating ProviderTree inventory for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.702 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating inventory in ProviderTree for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.717 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing aggregate associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.749 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing trait associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, traits: COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_FMA3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_AC97,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_ABM,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,HW_CPU_X86_CLMUL _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 07:14:35 compute-0 nova_compute[189265]: 2025-09-30 07:14:35.797 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.305 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.405 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'QFI\xfb') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.408 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.408 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d.part --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.473 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d.part --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.474 2 DEBUG nova.virt.images [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] 0c6b92f5-9861-49e4-862d-3ffd84520dfa was qcow2, converting to raw fetch_to_raw /usr/lib/python3.12/site-packages/nova/virt/images.py:278
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.475 2 DEBUG nova.privsep.utils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.475 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d.part /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d.converted execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.822 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.823 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.306s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.903 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d.part /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d.converted" returned: 0 in 0.427s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.907 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d.converted --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.975 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d.converted --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.977 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.957s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.978 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.984 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:14:36 compute-0 nova_compute[189265]: 2025-09-30 07:14:36.985 2 INFO oslo.privsep.daemon [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpvp6qrnkx/privsep.sock']
Sep 30 07:14:37 compute-0 podman[212565]: 2025-09-30 07:14:37.467999762 +0000 UTC m=+0.053855154 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true)
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.658 2 INFO oslo.privsep.daemon [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Spawned new privsep daemon via rootwrap
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.524 69 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.528 69 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.530 69 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.530 69 INFO oslo.privsep.daemon [-] privsep daemon running as pid 69
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.730 2 DEBUG nova.network.neutron [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Successfully updated port: 5e18274a-8ca2-4391-88b8-e5a90d72fc7c _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.732 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.779 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.780 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.780 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.781 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.783 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.784 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.830 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.830 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.856 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk 1073741824" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.856 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.076s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.857 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.868 2 DEBUG nova.compute.manager [req-463772a2-4f5f-4171-b1dc-d059f3f38bf4 req-0c1127ed-1a02-489c-8bd1-b47c784c241d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Received event network-changed-5e18274a-8ca2-4391-88b8-e5a90d72fc7c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.869 2 DEBUG nova.compute.manager [req-463772a2-4f5f-4171-b1dc-d059f3f38bf4 req-0c1127ed-1a02-489c-8bd1-b47c784c241d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Refreshing instance network info cache due to event network-changed-5e18274a-8ca2-4391-88b8-e5a90d72fc7c. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.869 2 DEBUG oslo_concurrency.lockutils [req-463772a2-4f5f-4171-b1dc-d059f3f38bf4 req-0c1127ed-1a02-489c-8bd1-b47c784c241d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-9fa193fb-a398-4552-85b4-a346dffcf697" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.869 2 DEBUG oslo_concurrency.lockutils [req-463772a2-4f5f-4171-b1dc-d059f3f38bf4 req-0c1127ed-1a02-489c-8bd1-b47c784c241d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-9fa193fb-a398-4552-85b4-a346dffcf697" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.869 2 DEBUG nova.network.neutron [req-463772a2-4f5f-4171-b1dc-d059f3f38bf4 req-0c1127ed-1a02-489c-8bd1-b47c784c241d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Refreshing network info cache for port 5e18274a-8ca2-4391-88b8-e5a90d72fc7c _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.905 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.906 2 DEBUG nova.virt.disk.api [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Checking if we can resize image /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.906 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.952 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.953 2 DEBUG nova.virt.disk.api [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Cannot resize image /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.953 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.954 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Ensure instance console log exists: /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.954 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.954 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:14:37 compute-0 nova_compute[189265]: 2025-09-30 07:14:37.954 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:14:38 compute-0 nova_compute[189265]: 2025-09-30 07:14:38.254 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "refresh_cache-9fa193fb-a398-4552-85b4-a346dffcf697" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:14:38 compute-0 nova_compute[189265]: 2025-09-30 07:14:38.446 2 WARNING neutronclient.v2_0.client [req-463772a2-4f5f-4171-b1dc-d059f3f38bf4 req-0c1127ed-1a02-489c-8bd1-b47c784c241d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:14:39 compute-0 nova_compute[189265]: 2025-09-30 07:14:39.117 2 DEBUG nova.network.neutron [req-463772a2-4f5f-4171-b1dc-d059f3f38bf4 req-0c1127ed-1a02-489c-8bd1-b47c784c241d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:14:39 compute-0 nova_compute[189265]: 2025-09-30 07:14:39.292 2 DEBUG nova.network.neutron [req-463772a2-4f5f-4171-b1dc-d059f3f38bf4 req-0c1127ed-1a02-489c-8bd1-b47c784c241d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:14:39 compute-0 nova_compute[189265]: 2025-09-30 07:14:39.810 2 DEBUG oslo_concurrency.lockutils [req-463772a2-4f5f-4171-b1dc-d059f3f38bf4 req-0c1127ed-1a02-489c-8bd1-b47c784c241d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-9fa193fb-a398-4552-85b4-a346dffcf697" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:14:39 compute-0 nova_compute[189265]: 2025-09-30 07:14:39.811 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquired lock "refresh_cache-9fa193fb-a398-4552-85b4-a346dffcf697" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:14:39 compute-0 nova_compute[189265]: 2025-09-30 07:14:39.812 2 DEBUG nova.network.neutron [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:14:39 compute-0 nova_compute[189265]: 2025-09-30 07:14:39.822 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:14:40 compute-0 nova_compute[189265]: 2025-09-30 07:14:40.828 2 DEBUG nova.network.neutron [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:14:41 compute-0 nova_compute[189265]: 2025-09-30 07:14:41.104 2 WARNING neutronclient.v2_0.client [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:14:41 compute-0 podman[212604]: 2025-09-30 07:14:41.5344907 +0000 UTC m=+0.110805157 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, version=9.6, config_id=edpm, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc.)
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.120 2 DEBUG nova.network.neutron [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Updating instance_info_cache with network_info: [{"id": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "address": "fa:16:3e:0d:a8:de", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e18274a-8c", "ovs_interfaceid": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.629 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Releasing lock "refresh_cache-9fa193fb-a398-4552-85b4-a346dffcf697" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.629 2 DEBUG nova.compute.manager [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Instance network_info: |[{"id": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "address": "fa:16:3e:0d:a8:de", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e18274a-8c", "ovs_interfaceid": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.635 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Start _get_guest_xml network_info=[{"id": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "address": "fa:16:3e:0d:a8:de", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e18274a-8c", "ovs_interfaceid": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '0c6b92f5-9861-49e4-862d-3ffd84520dfa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.641 2 WARNING nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.644 2 DEBUG nova.virt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1895073706', uuid='9fa193fb-a398-4552-85b4-a346dffcf697'), owner=OwnerMeta(userid='d6cb6be5d6fc407eb3abc1c7c70f5d77', username='tempest-TestExecuteActionsViaActuator-2061885601-project-admin', projectid='1413b21c2db845e58d8a81f524a55f3a', projectname='tempest-TestExecuteActionsViaActuator-2061885601'), image=ImageMeta(id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "address": "fa:16:3e:0d:a8:de", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e18274a-8c", "ovs_interfaceid": 
"5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759216482.6443195) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.650 2 DEBUG nova.virt.libvirt.host [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.651 2 DEBUG nova.virt.libvirt.host [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.654 2 DEBUG nova.virt.libvirt.host [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.655 2 DEBUG nova.virt.libvirt.host [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.657 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.657 2 DEBUG nova.virt.hardware [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T07:07:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.658 2 DEBUG nova.virt.hardware [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.659 2 DEBUG nova.virt.hardware [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.660 2 DEBUG nova.virt.hardware [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.660 2 DEBUG nova.virt.hardware [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.661 2 DEBUG nova.virt.hardware [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.661 2 DEBUG nova.virt.hardware [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.662 2 DEBUG nova.virt.hardware [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.662 2 DEBUG nova.virt.hardware [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.663 2 DEBUG nova.virt.hardware [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.663 2 DEBUG nova.virt.hardware [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.670 2 DEBUG nova.privsep.utils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.12/site-packages/nova/privsep/utils.py:63
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.672 2 DEBUG nova.virt.libvirt.vif [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:14:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1895073706',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1895073706',id=3,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1413b21c2db845e58d8a81f524a55f3a',ramdisk_id='',reservation_id='r-njnn49ef',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-2061885601',owner_user_name='tempest-TestExecuteActionsViaA
ctuator-2061885601-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:14:34Z,user_data=None,user_id='d6cb6be5d6fc407eb3abc1c7c70f5d77',uuid=9fa193fb-a398-4552-85b4-a346dffcf697,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "address": "fa:16:3e:0d:a8:de", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e18274a-8c", "ovs_interfaceid": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.672 2 DEBUG nova.network.os_vif_util [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converting VIF {"id": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "address": "fa:16:3e:0d:a8:de", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e18274a-8c", "ovs_interfaceid": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.674 2 DEBUG nova.network.os_vif_util [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:a8:de,bridge_name='br-int',has_traffic_filtering=True,id=5e18274a-8ca2-4391-88b8-e5a90d72fc7c,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e18274a-8c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:14:42 compute-0 nova_compute[189265]: 2025-09-30 07:14:42.676 2 DEBUG nova.objects.instance [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lazy-loading 'pci_devices' on Instance uuid 9fa193fb-a398-4552-85b4-a346dffcf697 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.187 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] End _get_guest_xml xml=<domain type="kvm">
Sep 30 07:14:43 compute-0 nova_compute[189265]:   <uuid>9fa193fb-a398-4552-85b4-a346dffcf697</uuid>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   <name>instance-00000003</name>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   <memory>131072</memory>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   <vcpu>1</vcpu>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1895073706</nova:name>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:14:42</nova:creationTime>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:14:43 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:14:43 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:14:43 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:14:43 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:14:43 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:14:43 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:14:43 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:14:43 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:14:43 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:14:43 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:14:43 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:14:43 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:14:43 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:14:43 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:14:43 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:14:43 compute-0 nova_compute[189265]:         <nova:user uuid="d6cb6be5d6fc407eb3abc1c7c70f5d77">tempest-TestExecuteActionsViaActuator-2061885601-project-admin</nova:user>
Sep 30 07:14:43 compute-0 nova_compute[189265]:         <nova:project uuid="1413b21c2db845e58d8a81f524a55f3a">tempest-TestExecuteActionsViaActuator-2061885601</nova:project>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:14:43 compute-0 nova_compute[189265]:         <nova:port uuid="5e18274a-8ca2-4391-88b8-e5a90d72fc7c">
Sep 30 07:14:43 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <system>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <entry name="serial">9fa193fb-a398-4552-85b4-a346dffcf697</entry>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <entry name="uuid">9fa193fb-a398-4552-85b4-a346dffcf697</entry>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     </system>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   <os>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   </os>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   <features>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <vmcoreinfo/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   </features>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   <cpu mode="host-model" match="exact">
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk.config"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:0d:a8:de"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <target dev="tap5e18274a-8c"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/console.log" append="off"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <video>
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     </video>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <controller type="usb" index="0"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:14:43 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:14:43 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:14:43 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:14:43 compute-0 nova_compute[189265]: </domain>
Sep 30 07:14:43 compute-0 nova_compute[189265]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.189 2 DEBUG nova.compute.manager [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Preparing to wait for external event network-vif-plugged-5e18274a-8ca2-4391-88b8-e5a90d72fc7c prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.189 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.190 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.190 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.191 2 DEBUG nova.virt.libvirt.vif [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:14:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1895073706',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1895073706',id=3,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1413b21c2db845e58d8a81f524a55f3a',ramdisk_id='',reservation_id='r-njnn49ef',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-2061885601',owner_user_name='tempest-TestExecuteActionsViaActuator-2061885601-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:14:34Z,user_data=None,user_id='d6cb6be5d6fc407eb3abc1c7c70f5d77',uuid=9fa193fb-a398-4552-85b4-a346dffcf697,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "address": "fa:16:3e:0d:a8:de", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e18274a-8c", "ovs_interfaceid": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.192 2 DEBUG nova.network.os_vif_util [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converting VIF {"id": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "address": "fa:16:3e:0d:a8:de", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e18274a-8c", "ovs_interfaceid": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.193 2 DEBUG nova.network.os_vif_util [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:a8:de,bridge_name='br-int',has_traffic_filtering=True,id=5e18274a-8ca2-4391-88b8-e5a90d72fc7c,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e18274a-8c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.194 2 DEBUG os_vif [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:a8:de,bridge_name='br-int',has_traffic_filtering=True,id=5e18274a-8ca2-4391-88b8-e5a90d72fc7c,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e18274a-8c') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.253 2 DEBUG ovsdbapp.backend.ovs_idl [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.253 2 DEBUG ovsdbapp.backend.ovs_idl [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.253 2 DEBUG ovsdbapp.backend.ovs_idl [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.267 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.267 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.268 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '3459b949-0088-50d2-a26d-bc6705cae26c', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:43 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.273 2 INFO oslo.privsep.daemon [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpj4ikds4k/privsep.sock']
Sep 30 07:14:43 compute-0 podman[212628]: 2025-09-30 07:14:43.5105466 +0000 UTC m=+0.089294066 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 07:14:43 compute-0 podman[212629]: 2025-09-30 07:14:43.564779144 +0000 UTC m=+0.136572649 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 07:14:44 compute-0 nova_compute[189265]: 2025-09-30 07:14:44.072 2 INFO oslo.privsep.daemon [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Spawned new privsep daemon via rootwrap
Sep 30 07:14:44 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.903 90 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 07:14:44 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.911 90 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 07:14:44 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.914 90 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Sep 30 07:14:44 compute-0 nova_compute[189265]: 2025-09-30 07:14:43.915 90 INFO oslo.privsep.daemon [-] privsep daemon running as pid 90
Sep 30 07:14:44 compute-0 nova_compute[189265]: 2025-09-30 07:14:44.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:44 compute-0 nova_compute[189265]: 2025-09-30 07:14:44.337 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5e18274a-8c, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:14:44 compute-0 nova_compute[189265]: 2025-09-30 07:14:44.338 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap5e18274a-8c, col_values=(('qos', UUID('09c38c36-3856-4f0b-a8c2-35037b1cfe51')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:14:44 compute-0 nova_compute[189265]: 2025-09-30 07:14:44.339 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap5e18274a-8c, col_values=(('external_ids', {'iface-id': '5e18274a-8ca2-4391-88b8-e5a90d72fc7c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0d:a8:de', 'vm-uuid': '9fa193fb-a398-4552-85b4-a346dffcf697'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:14:44 compute-0 nova_compute[189265]: 2025-09-30 07:14:44.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:44 compute-0 NetworkManager[51813]: <info>  [1759216484.3425] manager: (tap5e18274a-8c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Sep 30 07:14:44 compute-0 nova_compute[189265]: 2025-09-30 07:14:44.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:14:44 compute-0 nova_compute[189265]: 2025-09-30 07:14:44.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:44 compute-0 nova_compute[189265]: 2025-09-30 07:14:44.350 2 INFO os_vif [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:a8:de,bridge_name='br-int',has_traffic_filtering=True,id=5e18274a-8ca2-4391-88b8-e5a90d72fc7c,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e18274a-8c')
Sep 30 07:14:45 compute-0 podman[212679]: 2025-09-30 07:14:45.502800387 +0000 UTC m=+0.076655692 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 07:14:45 compute-0 nova_compute[189265]: 2025-09-30 07:14:45.914 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:14:45 compute-0 nova_compute[189265]: 2025-09-30 07:14:45.915 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:14:45 compute-0 nova_compute[189265]: 2025-09-30 07:14:45.915 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] No VIF found with MAC fa:16:3e:0d:a8:de, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 07:14:45 compute-0 nova_compute[189265]: 2025-09-30 07:14:45.916 2 INFO nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Using config drive
Sep 30 07:14:46 compute-0 nova_compute[189265]: 2025-09-30 07:14:46.426 2 WARNING neutronclient.v2_0.client [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:14:46 compute-0 nova_compute[189265]: 2025-09-30 07:14:46.675 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:47 compute-0 nova_compute[189265]: 2025-09-30 07:14:47.191 2 INFO nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Creating config drive at /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk.config
Sep 30 07:14:47 compute-0 nova_compute[189265]: 2025-09-30 07:14:47.197 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp4qzhghuq execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:14:47 compute-0 nova_compute[189265]: 2025-09-30 07:14:47.339 2 DEBUG oslo_concurrency.processutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp4qzhghuq" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:14:47 compute-0 kernel: tun: Universal TUN/TAP device driver, 1.6
Sep 30 07:14:47 compute-0 kernel: tap5e18274a-8c: entered promiscuous mode
Sep 30 07:14:47 compute-0 NetworkManager[51813]: <info>  [1759216487.4216] manager: (tap5e18274a-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/23)
Sep 30 07:14:47 compute-0 ovn_controller[91436]: 2025-09-30T07:14:47Z|00040|binding|INFO|Claiming lport 5e18274a-8ca2-4391-88b8-e5a90d72fc7c for this chassis.
Sep 30 07:14:47 compute-0 ovn_controller[91436]: 2025-09-30T07:14:47Z|00041|binding|INFO|5e18274a-8ca2-4391-88b8-e5a90d72fc7c: Claiming fa:16:3e:0d:a8:de 10.100.0.4
Sep 30 07:14:47 compute-0 nova_compute[189265]: 2025-09-30 07:14:47.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:47 compute-0 nova_compute[189265]: 2025-09-30 07:14:47.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:47 compute-0 systemd-udevd[212718]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:14:47 compute-0 NetworkManager[51813]: <info>  [1759216487.4723] device (tap5e18274a-8c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:14:47 compute-0 NetworkManager[51813]: <info>  [1759216487.4740] device (tap5e18274a-8c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:14:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:47.501 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:a8:de 10.100.0.4'], port_security=['fa:16:3e:0d:a8:de 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9fa193fb-a398-4552-85b4-a346dffcf697', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1413b21c2db845e58d8a81f524a55f3a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8ad3c6f6-3842-4d69-92ac-cef07b75c3bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b541691-433c-426c-b8b7-10d79319603a, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=5e18274a-8ca2-4391-88b8-e5a90d72fc7c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:14:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:47.503 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 5e18274a-8ca2-4391-88b8-e5a90d72fc7c in datapath 74ffbf65-ebbd-4587-bf5b-0b38421a4813 bound to our chassis
Sep 30 07:14:47 compute-0 systemd-machined[149233]: New machine qemu-1-instance-00000003.
Sep 30 07:14:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:47.505 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74ffbf65-ebbd-4587-bf5b-0b38421a4813
Sep 30 07:14:47 compute-0 nova_compute[189265]: 2025-09-30 07:14:47.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:47 compute-0 ovn_controller[91436]: 2025-09-30T07:14:47Z|00042|binding|INFO|Setting lport 5e18274a-8ca2-4391-88b8-e5a90d72fc7c ovn-installed in OVS
Sep 30 07:14:47 compute-0 ovn_controller[91436]: 2025-09-30T07:14:47Z|00043|binding|INFO|Setting lport 5e18274a-8ca2-4391-88b8-e5a90d72fc7c up in Southbound
Sep 30 07:14:47 compute-0 nova_compute[189265]: 2025-09-30 07:14:47.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:47 compute-0 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Sep 30 07:14:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:47.530 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[785e8e2f-6cdd-45b3-8cde-19759b09e236]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:47.531 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap74ffbf65-e1 in ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 07:14:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:47.534 210650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap74ffbf65-e0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 07:14:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:47.534 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[27f78540-dc47-4d1d-95bc-678d3f354c66]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:47.535 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7d2312b5-3f42-4dd1-b2fb-6525885a17a0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:47.567 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[a6aeebf7-e2c6-428b-8b25-d80997892d91]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:47.577 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[007fe32e-9f21-4b9d-a546-581b0aa1724a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:47.579 100322 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp9url38e9/privsep.sock']
Sep 30 07:14:47 compute-0 nova_compute[189265]: 2025-09-30 07:14:47.747 2 DEBUG nova.compute.manager [req-c9c58209-8022-40a3-b0da-a3dbad7e002e req-b83baa3a-0cb0-453e-9f0d-1b664f61249c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Received event network-vif-plugged-5e18274a-8ca2-4391-88b8-e5a90d72fc7c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:14:47 compute-0 nova_compute[189265]: 2025-09-30 07:14:47.747 2 DEBUG oslo_concurrency.lockutils [req-c9c58209-8022-40a3-b0da-a3dbad7e002e req-b83baa3a-0cb0-453e-9f0d-1b664f61249c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:14:47 compute-0 nova_compute[189265]: 2025-09-30 07:14:47.748 2 DEBUG oslo_concurrency.lockutils [req-c9c58209-8022-40a3-b0da-a3dbad7e002e req-b83baa3a-0cb0-453e-9f0d-1b664f61249c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:14:47 compute-0 nova_compute[189265]: 2025-09-30 07:14:47.748 2 DEBUG oslo_concurrency.lockutils [req-c9c58209-8022-40a3-b0da-a3dbad7e002e req-b83baa3a-0cb0-453e-9f0d-1b664f61249c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:14:47 compute-0 nova_compute[189265]: 2025-09-30 07:14:47.749 2 DEBUG nova.compute.manager [req-c9c58209-8022-40a3-b0da-a3dbad7e002e req-b83baa3a-0cb0-453e-9f0d-1b664f61249c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Processing event network-vif-plugged-5e18274a-8ca2-4391-88b8-e5a90d72fc7c _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 07:14:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:48.271 100322 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Sep 30 07:14:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:48.272 100322 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp9url38e9/privsep.sock __init__ /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:377
Sep 30 07:14:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:48.139 212743 INFO oslo.privsep.daemon [-] privsep daemon starting
Sep 30 07:14:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:48.145 212743 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Sep 30 07:14:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:48.148 212743 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Sep 30 07:14:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:48.148 212743 INFO oslo.privsep.daemon [-] privsep daemon running as pid 212743
Sep 30 07:14:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:48.273 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[bda9cb4c-f786-4534-ad90-17e61ba56043]: (2,) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:48.696 212743 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:14:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:48.696 212743 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:14:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:48.696 212743 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:14:48 compute-0 nova_compute[189265]: 2025-09-30 07:14:48.775 2 DEBUG nova.compute.manager [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 07:14:48 compute-0 nova_compute[189265]: 2025-09-30 07:14:48.789 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 07:14:48 compute-0 nova_compute[189265]: 2025-09-30 07:14:48.793 2 INFO nova.virt.libvirt.driver [-] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Instance spawned successfully.
Sep 30 07:14:48 compute-0 nova_compute[189265]: 2025-09-30 07:14:48.794 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.163 212743 INFO oslo_service.backend [-] Loading backend: eventlet
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.168 212743 INFO oslo_service.backend [-] Backend 'eventlet' successfully loaded and cached.
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.245 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec2813f-3d97-4a13-bb13-2cc2deec4676]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:49 compute-0 NetworkManager[51813]: <info>  [1759216489.2521] manager: (tap74ffbf65-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/24)
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.250 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[dab26133-c05c-4f97-a6e5-d4b7bc205563]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:49 compute-0 systemd-udevd[212721]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.282 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[1227d106-2980-4c0f-9a1a-672fe8da2c91]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.285 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[2c1920bb-7fe0-4651-9146-d50fc5494abf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:49 compute-0 NetworkManager[51813]: <info>  [1759216489.3049] device (tap74ffbf65-e0): carrier: link connected
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.311 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[26a15689-b9f1-4e40-b5a2-3ef9efaec9bf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.311 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.311 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.312 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.313 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.314 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.315 2 DEBUG nova.virt.libvirt.driver [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.340 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d1e79ab4-e541-4d05-8ef3-2591a1f0bf7e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74ffbf65-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:ef:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434702, 'reachable_time': 41674, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 212773, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.358 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c75289ec-0744-4efa-89b7-abda29f8130a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1f:efaf'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434702, 'tstamp': 434702}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 212774, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.375 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0db1d87a-4fba-43ae-aaf0-2d0a15cfd634]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74ffbf65-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:ef:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434702, 'reachable_time': 41674, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 212775, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.407 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[907eee81-c36e-4284-bb53-c16b3592f274]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.468 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7e79b1ea-00b6-41d2-b829-78f871f2a4f9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.469 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74ffbf65-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.469 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.469 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74ffbf65-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:49 compute-0 NetworkManager[51813]: <info>  [1759216489.4717] manager: (tap74ffbf65-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/25)
Sep 30 07:14:49 compute-0 kernel: tap74ffbf65-e0: entered promiscuous mode
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.474 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74ffbf65-e0, col_values=(('external_ids', {'iface-id': '0c700e20-e593-4a77-93d7-fc919dc1f294'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:49 compute-0 ovn_controller[91436]: 2025-09-30T07:14:49Z|00044|binding|INFO|Releasing lport 0c700e20-e593-4a77-93d7-fc919dc1f294 from this chassis (sb_readonly=0)
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.486 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f5bc1292-194d-4ff2-b256-aecc67cbd332]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.486 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.487 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.487 100322 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 74ffbf65-ebbd-4587-bf5b-0b38421a4813 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.487 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.487 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[16a059f8-001b-400e-b0c7-56b33e5e7630]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.488 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.488 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7b3a0850-68b0-4be6-ae9c-03227fee3578]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.489 100322 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: global
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     log         /dev/log local0 debug
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     log-tag     haproxy-metadata-proxy-74ffbf65-ebbd-4587-bf5b-0b38421a4813
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     user        root
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     group       root
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     maxconn     1024
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     pidfile     /var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     daemon
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: defaults
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     log global
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     mode http
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     option httplog
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     option dontlognull
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     option http-server-close
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     option forwardfor
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     retries                 3
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     timeout http-request    30s
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     timeout connect         30s
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     timeout client          32s
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     timeout server          32s
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     timeout http-keep-alive 30s
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: listen listener
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     bind 169.254.169.254:80
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:     http-request add-header X-OVN-Network-ID 74ffbf65-ebbd-4587-bf5b-0b38421a4813
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 07:14:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:14:49.490 100322 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'env', 'PROCESS_TAG=haproxy-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/74ffbf65-ebbd-4587-bf5b-0b38421a4813.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.807 2 DEBUG nova.compute.manager [req-0eaa21ae-9d9c-4ab8-9e01-a5c32519f2bb req-5c69cfd1-7067-4d9e-9d99-cb4b36e07b4a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Received event network-vif-plugged-5e18274a-8ca2-4391-88b8-e5a90d72fc7c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.808 2 DEBUG oslo_concurrency.lockutils [req-0eaa21ae-9d9c-4ab8-9e01-a5c32519f2bb req-5c69cfd1-7067-4d9e-9d99-cb4b36e07b4a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.809 2 DEBUG oslo_concurrency.lockutils [req-0eaa21ae-9d9c-4ab8-9e01-a5c32519f2bb req-5c69cfd1-7067-4d9e-9d99-cb4b36e07b4a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.809 2 DEBUG oslo_concurrency.lockutils [req-0eaa21ae-9d9c-4ab8-9e01-a5c32519f2bb req-5c69cfd1-7067-4d9e-9d99-cb4b36e07b4a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.809 2 DEBUG nova.compute.manager [req-0eaa21ae-9d9c-4ab8-9e01-a5c32519f2bb req-5c69cfd1-7067-4d9e-9d99-cb4b36e07b4a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] No waiting events found dispatching network-vif-plugged-5e18274a-8ca2-4391-88b8-e5a90d72fc7c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.810 2 WARNING nova.compute.manager [req-0eaa21ae-9d9c-4ab8-9e01-a5c32519f2bb req-5c69cfd1-7067-4d9e-9d99-cb4b36e07b4a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Received unexpected event network-vif-plugged-5e18274a-8ca2-4391-88b8-e5a90d72fc7c for instance with vm_state building and task_state spawning.
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.840 2 INFO nova.compute.manager [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Took 14.83 seconds to spawn the instance on the hypervisor.
Sep 30 07:14:49 compute-0 nova_compute[189265]: 2025-09-30 07:14:49.841 2 DEBUG nova.compute.manager [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:14:49 compute-0 podman[212806]: 2025-09-30 07:14:49.93155014 +0000 UTC m=+0.073934263 container create 4f8fa3b9e27071334d0f82812cc3f92f254f7c3e1c81ac2da6da7bed7c85f659 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 07:14:49 compute-0 podman[212806]: 2025-09-30 07:14:49.882872287 +0000 UTC m=+0.025256460 image pull eeebcc09bc72f81ab45f5ab87eb8f6a7b554b949227aeec082bdb0732754ddc8 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 07:14:49 compute-0 systemd[1]: Started libpod-conmon-4f8fa3b9e27071334d0f82812cc3f92f254f7c3e1c81ac2da6da7bed7c85f659.scope.
Sep 30 07:14:50 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:14:50 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/de4e4deace1333bc25798c2cea434bd6e9718bac418016a5573694616caedb8d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 07:14:50 compute-0 podman[212806]: 2025-09-30 07:14:50.050795428 +0000 UTC m=+0.193179591 container init 4f8fa3b9e27071334d0f82812cc3f92f254f7c3e1c81ac2da6da7bed7c85f659 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 07:14:50 compute-0 podman[212806]: 2025-09-30 07:14:50.058027507 +0000 UTC m=+0.200411630 container start 4f8fa3b9e27071334d0f82812cc3f92f254f7c3e1c81ac2da6da7bed7c85f659 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Sep 30 07:14:50 compute-0 neutron-haproxy-ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813[212820]: [NOTICE]   (212824) : New worker (212826) forked
Sep 30 07:14:50 compute-0 neutron-haproxy-ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813[212820]: [NOTICE]   (212824) : Loading success.
Sep 30 07:14:50 compute-0 nova_compute[189265]: 2025-09-30 07:14:50.383 2 INFO nova.compute.manager [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Took 20.57 seconds to build instance.
Sep 30 07:14:50 compute-0 nova_compute[189265]: 2025-09-30 07:14:50.892 2 DEBUG oslo_concurrency.lockutils [None req-0f175e21-a519-4aa9-8519-24c75106c1bc d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "9fa193fb-a398-4552-85b4-a346dffcf697" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.170s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:14:51 compute-0 nova_compute[189265]: 2025-09-30 07:14:51.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:54 compute-0 nova_compute[189265]: 2025-09-30 07:14:54.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:56 compute-0 nova_compute[189265]: 2025-09-30 07:14:56.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:58 compute-0 podman[212836]: 2025-09-30 07:14:58.480815237 +0000 UTC m=+0.063545523 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:14:59 compute-0 nova_compute[189265]: 2025-09-30 07:14:59.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:14:59 compute-0 podman[199733]: time="2025-09-30T07:14:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:14:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:14:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:14:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:14:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3451 "" "Go-http-client/1.1"
Sep 30 07:15:00 compute-0 ovn_controller[91436]: 2025-09-30T07:15:00Z|00003|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0d:a8:de 10.100.0.4
Sep 30 07:15:00 compute-0 ovn_controller[91436]: 2025-09-30T07:15:00Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0d:a8:de 10.100.0.4
Sep 30 07:15:01 compute-0 openstack_network_exporter[201859]: ERROR   07:15:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:15:01 compute-0 openstack_network_exporter[201859]: ERROR   07:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:15:01 compute-0 openstack_network_exporter[201859]: ERROR   07:15:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:15:01 compute-0 openstack_network_exporter[201859]: ERROR   07:15:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:15:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:15:01 compute-0 openstack_network_exporter[201859]: ERROR   07:15:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:15:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:15:01 compute-0 nova_compute[189265]: 2025-09-30 07:15:01.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:04 compute-0 nova_compute[189265]: 2025-09-30 07:15:04.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:06 compute-0 nova_compute[189265]: 2025-09-30 07:15:06.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:08 compute-0 podman[212872]: 2025-09-30 07:15:08.50344684 +0000 UTC m=+0.080756239 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:15:09 compute-0 nova_compute[189265]: 2025-09-30 07:15:09.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:11 compute-0 nova_compute[189265]: 2025-09-30 07:15:11.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:12 compute-0 podman[212894]: 2025-09-30 07:15:12.51318785 +0000 UTC m=+0.093064094 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, version=9.6, architecture=x86_64, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Sep 30 07:15:14 compute-0 nova_compute[189265]: 2025-09-30 07:15:14.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:14 compute-0 podman[212916]: 2025-09-30 07:15:14.486288814 +0000 UTC m=+0.073708826 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd)
Sep 30 07:15:14 compute-0 podman[212917]: 2025-09-30 07:15:14.537545342 +0000 UTC m=+0.112792943 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4)
Sep 30 07:15:16 compute-0 podman[212962]: 2025-09-30 07:15:16.528528653 +0000 UTC m=+0.101322333 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 07:15:16 compute-0 nova_compute[189265]: 2025-09-30 07:15:16.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:19 compute-0 nova_compute[189265]: 2025-09-30 07:15:19.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:19 compute-0 ovn_controller[91436]: 2025-09-30T07:15:19Z|00045|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Sep 30 07:15:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:15:20.538 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:15:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:15:20.538 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:15:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:15:20.539 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:15:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:15:21.446 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:15:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:15:21.446 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:15:21 compute-0 nova_compute[189265]: 2025-09-30 07:15:21.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:21 compute-0 nova_compute[189265]: 2025-09-30 07:15:21.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:24 compute-0 nova_compute[189265]: 2025-09-30 07:15:24.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:26 compute-0 nova_compute[189265]: 2025-09-30 07:15:26.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:29 compute-0 nova_compute[189265]: 2025-09-30 07:15:29.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:15:29.447 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:15:29 compute-0 podman[212983]: 2025-09-30 07:15:29.488265036 +0000 UTC m=+0.069026802 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:15:29 compute-0 podman[199733]: time="2025-09-30T07:15:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:15:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:15:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:15:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:15:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3459 "" "Go-http-client/1.1"
Sep 30 07:15:30 compute-0 nova_compute[189265]: 2025-09-30 07:15:30.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:15:31 compute-0 openstack_network_exporter[201859]: ERROR   07:15:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:15:31 compute-0 openstack_network_exporter[201859]: ERROR   07:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:15:31 compute-0 openstack_network_exporter[201859]: ERROR   07:15:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:15:31 compute-0 openstack_network_exporter[201859]: ERROR   07:15:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:15:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:15:31 compute-0 openstack_network_exporter[201859]: ERROR   07:15:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:15:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:15:31 compute-0 nova_compute[189265]: 2025-09-30 07:15:31.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:31 compute-0 nova_compute[189265]: 2025-09-30 07:15:31.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:15:32 compute-0 nova_compute[189265]: 2025-09-30 07:15:32.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:15:32 compute-0 nova_compute[189265]: 2025-09-30 07:15:32.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:15:32 compute-0 nova_compute[189265]: 2025-09-30 07:15:32.790 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:15:32 compute-0 nova_compute[189265]: 2025-09-30 07:15:32.790 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:15:33 compute-0 nova_compute[189265]: 2025-09-30 07:15:33.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:15:34 compute-0 nova_compute[189265]: 2025-09-30 07:15:34.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:34 compute-0 nova_compute[189265]: 2025-09-30 07:15:34.784 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:15:35 compute-0 nova_compute[189265]: 2025-09-30 07:15:35.294 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:15:35 compute-0 nova_compute[189265]: 2025-09-30 07:15:35.816 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:15:35 compute-0 nova_compute[189265]: 2025-09-30 07:15:35.817 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:15:35 compute-0 nova_compute[189265]: 2025-09-30 07:15:35.817 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:15:35 compute-0 nova_compute[189265]: 2025-09-30 07:15:35.818 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:15:36 compute-0 nova_compute[189265]: 2025-09-30 07:15:36.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:36 compute-0 nova_compute[189265]: 2025-09-30 07:15:36.869 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:15:36 compute-0 nova_compute[189265]: 2025-09-30 07:15:36.958 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:15:36 compute-0 nova_compute[189265]: 2025-09-30 07:15:36.958 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:15:37 compute-0 nova_compute[189265]: 2025-09-30 07:15:37.023 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:15:37 compute-0 nova_compute[189265]: 2025-09-30 07:15:37.176 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:15:37 compute-0 nova_compute[189265]: 2025-09-30 07:15:37.177 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:15:37 compute-0 nova_compute[189265]: 2025-09-30 07:15:37.194 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:15:37 compute-0 nova_compute[189265]: 2025-09-30 07:15:37.195 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5697MB free_disk=73.28014755249023GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:15:37 compute-0 nova_compute[189265]: 2025-09-30 07:15:37.195 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:15:37 compute-0 nova_compute[189265]: 2025-09-30 07:15:37.195 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:15:38 compute-0 nova_compute[189265]: 2025-09-30 07:15:38.248 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance 9fa193fb-a398-4552-85b4-a346dffcf697 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:15:38 compute-0 nova_compute[189265]: 2025-09-30 07:15:38.249 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:15:38 compute-0 nova_compute[189265]: 2025-09-30 07:15:38.249 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:15:37 up  1:13,  0 user,  load average: 0.32, 0.31, 0.45\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_1413b21c2db845e58d8a81f524a55f3a': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:15:38 compute-0 nova_compute[189265]: 2025-09-30 07:15:38.317 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating inventory in ProviderTree for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 07:15:38 compute-0 nova_compute[189265]: 2025-09-30 07:15:38.878 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updated inventory for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:975
Sep 30 07:15:38 compute-0 nova_compute[189265]: 2025-09-30 07:15:38.879 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 07:15:38 compute-0 nova_compute[189265]: 2025-09-30 07:15:38.879 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating inventory in ProviderTree for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 07:15:39 compute-0 nova_compute[189265]: 2025-09-30 07:15:39.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:39 compute-0 nova_compute[189265]: 2025-09-30 07:15:39.438 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:15:39 compute-0 nova_compute[189265]: 2025-09-30 07:15:39.439 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.243s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:15:39 compute-0 podman[213014]: 2025-09-30 07:15:39.498321294 +0000 UTC m=+0.080726609 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, config_id=iscsid, org.label-schema.build-date=20250930, tcib_managed=true)
Sep 30 07:15:41 compute-0 nova_compute[189265]: 2025-09-30 07:15:41.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:41 compute-0 nova_compute[189265]: 2025-09-30 07:15:41.932 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:15:43 compute-0 podman[213034]: 2025-09-30 07:15:43.509023763 +0000 UTC m=+0.088513743 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Sep 30 07:15:44 compute-0 nova_compute[189265]: 2025-09-30 07:15:44.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:45 compute-0 podman[213055]: 2025-09-30 07:15:45.475120105 +0000 UTC m=+0.061810793 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20250930)
Sep 30 07:15:45 compute-0 podman[213056]: 2025-09-30 07:15:45.539202763 +0000 UTC m=+0.121808503 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller)
Sep 30 07:15:46 compute-0 nova_compute[189265]: 2025-09-30 07:15:46.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:47 compute-0 podman[213102]: 2025-09-30 07:15:47.500610591 +0000 UTC m=+0.079468122 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 07:15:49 compute-0 nova_compute[189265]: 2025-09-30 07:15:49.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:51 compute-0 nova_compute[189265]: 2025-09-30 07:15:51.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:54 compute-0 nova_compute[189265]: 2025-09-30 07:15:54.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:56 compute-0 nova_compute[189265]: 2025-09-30 07:15:56.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:59 compute-0 nova_compute[189265]: 2025-09-30 07:15:59.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:15:59 compute-0 podman[199733]: time="2025-09-30T07:15:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:15:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:15:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:15:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:15:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3460 "" "Go-http-client/1.1"
Sep 30 07:16:00 compute-0 podman[213120]: 2025-09-30 07:16:00.482873332 +0000 UTC m=+0.071994866 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:16:01 compute-0 openstack_network_exporter[201859]: ERROR   07:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:16:01 compute-0 openstack_network_exporter[201859]: ERROR   07:16:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:16:01 compute-0 openstack_network_exporter[201859]: ERROR   07:16:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:16:01 compute-0 openstack_network_exporter[201859]: ERROR   07:16:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:16:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:16:01 compute-0 openstack_network_exporter[201859]: ERROR   07:16:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:16:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:16:01 compute-0 nova_compute[189265]: 2025-09-30 07:16:01.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:04 compute-0 nova_compute[189265]: 2025-09-30 07:16:04.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:06 compute-0 nova_compute[189265]: 2025-09-30 07:16:06.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:07 compute-0 nova_compute[189265]: 2025-09-30 07:16:07.163 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:16:07 compute-0 nova_compute[189265]: 2025-09-30 07:16:07.164 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:16:07 compute-0 nova_compute[189265]: 2025-09-30 07:16:07.671 2 DEBUG nova.compute.manager [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 07:16:08 compute-0 nova_compute[189265]: 2025-09-30 07:16:08.283 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:16:08 compute-0 nova_compute[189265]: 2025-09-30 07:16:08.284 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:16:08 compute-0 nova_compute[189265]: 2025-09-30 07:16:08.292 2 DEBUG nova.virt.hardware [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 07:16:08 compute-0 nova_compute[189265]: 2025-09-30 07:16:08.293 2 INFO nova.compute.claims [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Claim successful on node compute-0.ctlplane.example.com
Sep 30 07:16:09 compute-0 nova_compute[189265]: 2025-09-30 07:16:09.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:09 compute-0 nova_compute[189265]: 2025-09-30 07:16:09.428 2 DEBUG nova.compute.provider_tree [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:16:09 compute-0 nova_compute[189265]: 2025-09-30 07:16:09.939 2 DEBUG nova.scheduler.client.report [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:16:10 compute-0 nova_compute[189265]: 2025-09-30 07:16:10.457 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.173s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:16:10 compute-0 nova_compute[189265]: 2025-09-30 07:16:10.458 2 DEBUG nova.compute.manager [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 07:16:10 compute-0 podman[213159]: 2025-09-30 07:16:10.494748053 +0000 UTC m=+0.073875371 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 07:16:10 compute-0 nova_compute[189265]: 2025-09-30 07:16:10.975 2 DEBUG nova.compute.manager [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 07:16:10 compute-0 nova_compute[189265]: 2025-09-30 07:16:10.976 2 DEBUG nova.network.neutron [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 07:16:10 compute-0 nova_compute[189265]: 2025-09-30 07:16:10.976 2 WARNING neutronclient.v2_0.client [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:16:10 compute-0 nova_compute[189265]: 2025-09-30 07:16:10.977 2 WARNING neutronclient.v2_0.client [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:16:11 compute-0 nova_compute[189265]: 2025-09-30 07:16:11.491 2 INFO nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 07:16:11 compute-0 nova_compute[189265]: 2025-09-30 07:16:11.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:12 compute-0 nova_compute[189265]: 2025-09-30 07:16:12.006 2 DEBUG nova.compute.manager [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.094 2 DEBUG nova.compute.manager [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.096 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.096 2 INFO nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Creating image(s)
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.097 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "/var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.097 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "/var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.098 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "/var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.099 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.103 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.105 2 DEBUG oslo_concurrency.processutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.166 2 DEBUG oslo_concurrency.processutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.167 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.167 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.168 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.173 2 DEBUG oslo_utils.imageutils.format_inspector [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.173 2 DEBUG oslo_concurrency.processutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.221 2 DEBUG oslo_concurrency.processutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.222 2 DEBUG oslo_concurrency.processutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.277 2 DEBUG oslo_concurrency.processutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk 1073741824" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.278 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.110s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.278 2 DEBUG oslo_concurrency.processutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.328 2 DEBUG oslo_concurrency.processutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.329 2 DEBUG nova.virt.disk.api [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Checking if we can resize image /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.330 2 DEBUG oslo_concurrency.processutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.382 2 DEBUG oslo_concurrency.processutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.384 2 DEBUG nova.virt.disk.api [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Cannot resize image /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.385 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.385 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Ensure instance console log exists: /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.386 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.386 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:16:13 compute-0 nova_compute[189265]: 2025-09-30 07:16:13.387 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:16:14 compute-0 nova_compute[189265]: 2025-09-30 07:16:14.355 2 DEBUG nova.network.neutron [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Successfully created port: dd1a8613-e62a-44c6-9960-a46776a2c059 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 07:16:14 compute-0 nova_compute[189265]: 2025-09-30 07:16:14.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:14 compute-0 podman[213194]: 2025-09-30 07:16:14.516618134 +0000 UTC m=+0.091777757 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, distribution-scope=public, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., release=1755695350, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.tags=minimal rhel9)
Sep 30 07:16:15 compute-0 nova_compute[189265]: 2025-09-30 07:16:15.150 2 DEBUG nova.network.neutron [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Successfully updated port: dd1a8613-e62a-44c6-9960-a46776a2c059 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 07:16:15 compute-0 nova_compute[189265]: 2025-09-30 07:16:15.244 2 DEBUG nova.compute.manager [req-7068f9a6-f043-4ea6-8623-76dc288180ad req-17852aad-21f9-497e-ba47-fe1794b19f5d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Received event network-changed-dd1a8613-e62a-44c6-9960-a46776a2c059 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:16:15 compute-0 nova_compute[189265]: 2025-09-30 07:16:15.244 2 DEBUG nova.compute.manager [req-7068f9a6-f043-4ea6-8623-76dc288180ad req-17852aad-21f9-497e-ba47-fe1794b19f5d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Refreshing instance network info cache due to event network-changed-dd1a8613-e62a-44c6-9960-a46776a2c059. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 07:16:15 compute-0 nova_compute[189265]: 2025-09-30 07:16:15.245 2 DEBUG oslo_concurrency.lockutils [req-7068f9a6-f043-4ea6-8623-76dc288180ad req-17852aad-21f9-497e-ba47-fe1794b19f5d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-d40a0fba-a20e-4dcf-a048-10d9e21c6cf6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:16:15 compute-0 nova_compute[189265]: 2025-09-30 07:16:15.245 2 DEBUG oslo_concurrency.lockutils [req-7068f9a6-f043-4ea6-8623-76dc288180ad req-17852aad-21f9-497e-ba47-fe1794b19f5d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-d40a0fba-a20e-4dcf-a048-10d9e21c6cf6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:16:15 compute-0 nova_compute[189265]: 2025-09-30 07:16:15.245 2 DEBUG nova.network.neutron [req-7068f9a6-f043-4ea6-8623-76dc288180ad req-17852aad-21f9-497e-ba47-fe1794b19f5d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Refreshing network info cache for port dd1a8613-e62a-44c6-9960-a46776a2c059 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 07:16:15 compute-0 nova_compute[189265]: 2025-09-30 07:16:15.676 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "refresh_cache-d40a0fba-a20e-4dcf-a048-10d9e21c6cf6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:16:15 compute-0 nova_compute[189265]: 2025-09-30 07:16:15.763 2 WARNING neutronclient.v2_0.client [req-7068f9a6-f043-4ea6-8623-76dc288180ad req-17852aad-21f9-497e-ba47-fe1794b19f5d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:16:16 compute-0 nova_compute[189265]: 2025-09-30 07:16:16.146 2 DEBUG nova.network.neutron [req-7068f9a6-f043-4ea6-8623-76dc288180ad req-17852aad-21f9-497e-ba47-fe1794b19f5d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:16:16 compute-0 nova_compute[189265]: 2025-09-30 07:16:16.313 2 DEBUG nova.network.neutron [req-7068f9a6-f043-4ea6-8623-76dc288180ad req-17852aad-21f9-497e-ba47-fe1794b19f5d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:16:16 compute-0 podman[213215]: 2025-09-30 07:16:16.50916675 +0000 UTC m=+0.079743081 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 07:16:16 compute-0 podman[213216]: 2025-09-30 07:16:16.550597615 +0000 UTC m=+0.118761676 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:16:16 compute-0 nova_compute[189265]: 2025-09-30 07:16:16.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:16 compute-0 nova_compute[189265]: 2025-09-30 07:16:16.907 2 DEBUG oslo_concurrency.lockutils [req-7068f9a6-f043-4ea6-8623-76dc288180ad req-17852aad-21f9-497e-ba47-fe1794b19f5d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-d40a0fba-a20e-4dcf-a048-10d9e21c6cf6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:16:16 compute-0 nova_compute[189265]: 2025-09-30 07:16:16.908 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquired lock "refresh_cache-d40a0fba-a20e-4dcf-a048-10d9e21c6cf6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:16:16 compute-0 nova_compute[189265]: 2025-09-30 07:16:16.908 2 DEBUG nova.network.neutron [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:16:17 compute-0 nova_compute[189265]: 2025-09-30 07:16:17.664 2 DEBUG nova.network.neutron [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:16:17 compute-0 nova_compute[189265]: 2025-09-30 07:16:17.897 2 WARNING neutronclient.v2_0.client [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.029 2 DEBUG nova.network.neutron [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Updating instance_info_cache with network_info: [{"id": "dd1a8613-e62a-44c6-9960-a46776a2c059", "address": "fa:16:3e:ac:12:53", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd1a8613-e6", "ovs_interfaceid": "dd1a8613-e62a-44c6-9960-a46776a2c059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:16:18 compute-0 podman[213262]: 2025-09-30 07:16:18.467340494 +0000 UTC m=+0.056139260 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, 
org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.536 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Releasing lock "refresh_cache-d40a0fba-a20e-4dcf-a048-10d9e21c6cf6" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.536 2 DEBUG nova.compute.manager [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Instance network_info: |[{"id": "dd1a8613-e62a-44c6-9960-a46776a2c059", "address": "fa:16:3e:ac:12:53", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd1a8613-e6", "ovs_interfaceid": "dd1a8613-e62a-44c6-9960-a46776a2c059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.538 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Start _get_guest_xml network_info=[{"id": "dd1a8613-e62a-44c6-9960-a46776a2c059", "address": "fa:16:3e:ac:12:53", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd1a8613-e6", "ovs_interfaceid": "dd1a8613-e62a-44c6-9960-a46776a2c059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '0c6b92f5-9861-49e4-862d-3ffd84520dfa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.541 2 WARNING nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.542 2 DEBUG nova.virt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1824333052', uuid='d40a0fba-a20e-4dcf-a048-10d9e21c6cf6'), owner=OwnerMeta(userid='d6cb6be5d6fc407eb3abc1c7c70f5d77', username='tempest-TestExecuteActionsViaActuator-2061885601-project-admin', projectid='1413b21c2db845e58d8a81f524a55f3a', projectname='tempest-TestExecuteActionsViaActuator-2061885601'), image=ImageMeta(id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "dd1a8613-e62a-44c6-9960-a46776a2c059", "address": "fa:16:3e:ac:12:53", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd1a8613-e6", "ovs_interfaceid": 
"dd1a8613-e62a-44c6-9960-a46776a2c059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759216578.5427063) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.547 2 DEBUG nova.virt.libvirt.host [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.548 2 DEBUG nova.virt.libvirt.host [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.551 2 DEBUG nova.virt.libvirt.host [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.551 2 DEBUG nova.virt.libvirt.host [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.551 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.551 2 DEBUG nova.virt.hardware [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T07:07:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.552 2 DEBUG nova.virt.hardware [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.552 2 DEBUG nova.virt.hardware [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.552 2 DEBUG nova.virt.hardware [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.552 2 DEBUG nova.virt.hardware [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.552 2 DEBUG nova.virt.hardware [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.552 2 DEBUG nova.virt.hardware [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.553 2 DEBUG nova.virt.hardware [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.553 2 DEBUG nova.virt.hardware [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.553 2 DEBUG nova.virt.hardware [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.553 2 DEBUG nova.virt.hardware [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.556 2 DEBUG nova.virt.libvirt.vif [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:16:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1824333052',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1824333052',id=5,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1413b21c2db845e58d8a81f524a55f3a',ramdisk_id='',reservation_id='r-o0ztmckl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-2061885601',owner_user_name='tempest-TestExecuteActionsViaA
ctuator-2061885601-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:16:12Z,user_data=None,user_id='d6cb6be5d6fc407eb3abc1c7c70f5d77',uuid=d40a0fba-a20e-4dcf-a048-10d9e21c6cf6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd1a8613-e62a-44c6-9960-a46776a2c059", "address": "fa:16:3e:ac:12:53", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd1a8613-e6", "ovs_interfaceid": "dd1a8613-e62a-44c6-9960-a46776a2c059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.556 2 DEBUG nova.network.os_vif_util [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converting VIF {"id": "dd1a8613-e62a-44c6-9960-a46776a2c059", "address": "fa:16:3e:ac:12:53", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd1a8613-e6", "ovs_interfaceid": "dd1a8613-e62a-44c6-9960-a46776a2c059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.557 2 DEBUG nova.network.os_vif_util [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:12:53,bridge_name='br-int',has_traffic_filtering=True,id=dd1a8613-e62a-44c6-9960-a46776a2c059,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd1a8613-e6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:16:18 compute-0 nova_compute[189265]: 2025-09-30 07:16:18.557 2 DEBUG nova.objects.instance [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lazy-loading 'pci_devices' on Instance uuid d40a0fba-a20e-4dcf-a048-10d9e21c6cf6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.071 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] End _get_guest_xml xml=<domain type="kvm">
Sep 30 07:16:19 compute-0 nova_compute[189265]:   <uuid>d40a0fba-a20e-4dcf-a048-10d9e21c6cf6</uuid>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   <name>instance-00000005</name>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   <memory>131072</memory>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   <vcpu>1</vcpu>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1824333052</nova:name>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:16:18</nova:creationTime>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:16:19 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:16:19 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:16:19 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:16:19 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:16:19 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:16:19 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:16:19 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:16:19 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:16:19 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:16:19 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:16:19 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:16:19 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:16:19 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:16:19 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:16:19 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:16:19 compute-0 nova_compute[189265]:         <nova:user uuid="d6cb6be5d6fc407eb3abc1c7c70f5d77">tempest-TestExecuteActionsViaActuator-2061885601-project-admin</nova:user>
Sep 30 07:16:19 compute-0 nova_compute[189265]:         <nova:project uuid="1413b21c2db845e58d8a81f524a55f3a">tempest-TestExecuteActionsViaActuator-2061885601</nova:project>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:16:19 compute-0 nova_compute[189265]:         <nova:port uuid="dd1a8613-e62a-44c6-9960-a46776a2c059">
Sep 30 07:16:19 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <system>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <entry name="serial">d40a0fba-a20e-4dcf-a048-10d9e21c6cf6</entry>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <entry name="uuid">d40a0fba-a20e-4dcf-a048-10d9e21c6cf6</entry>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     </system>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   <os>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   </os>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   <features>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <vmcoreinfo/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   </features>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   <cpu mode="host-model" match="exact">
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk.config"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:ac:12:53"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <target dev="tapdd1a8613-e6"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/console.log" append="off"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <video>
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     </video>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <controller type="usb" index="0"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:16:19 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:16:19 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:16:19 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:16:19 compute-0 nova_compute[189265]: </domain>
Sep 30 07:16:19 compute-0 nova_compute[189265]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.073 2 DEBUG nova.compute.manager [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Preparing to wait for external event network-vif-plugged-dd1a8613-e62a-44c6-9960-a46776a2c059 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.073 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.074 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.074 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.075 2 DEBUG nova.virt.libvirt.vif [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:16:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1824333052',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1824333052',id=5,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1413b21c2db845e58d8a81f524a55f3a',ramdisk_id='',reservation_id='r-o0ztmckl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-2061885601',owner_user_name='tempest-TestExecuteA
ctionsViaActuator-2061885601-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:16:12Z,user_data=None,user_id='d6cb6be5d6fc407eb3abc1c7c70f5d77',uuid=d40a0fba-a20e-4dcf-a048-10d9e21c6cf6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd1a8613-e62a-44c6-9960-a46776a2c059", "address": "fa:16:3e:ac:12:53", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd1a8613-e6", "ovs_interfaceid": "dd1a8613-e62a-44c6-9960-a46776a2c059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.075 2 DEBUG nova.network.os_vif_util [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converting VIF {"id": "dd1a8613-e62a-44c6-9960-a46776a2c059", "address": "fa:16:3e:ac:12:53", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd1a8613-e6", "ovs_interfaceid": "dd1a8613-e62a-44c6-9960-a46776a2c059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.075 2 DEBUG nova.network.os_vif_util [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:12:53,bridge_name='br-int',has_traffic_filtering=True,id=dd1a8613-e62a-44c6-9960-a46776a2c059,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd1a8613-e6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.076 2 DEBUG os_vif [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:12:53,bridge_name='br-int',has_traffic_filtering=True,id=dd1a8613-e62a-44c6-9960-a46776a2c059,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd1a8613-e6') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.076 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.077 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.078 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c22e70f2-6678-538f-825e-adf419b60559', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.082 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd1a8613-e6, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.083 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapdd1a8613-e6, col_values=(('qos', UUID('621e1e7e-3789-4f4a-a676-70cbb85a0ddc')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.083 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapdd1a8613-e6, col_values=(('external_ids', {'iface-id': 'dd1a8613-e62a-44c6-9960-a46776a2c059', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ac:12:53', 'vm-uuid': 'd40a0fba-a20e-4dcf-a048-10d9e21c6cf6'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.084 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:19 compute-0 NetworkManager[51813]: <info>  [1759216579.0850] manager: (tapdd1a8613-e6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:19 compute-0 nova_compute[189265]: 2025-09-30 07:16:19.094 2 INFO os_vif [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:12:53,bridge_name='br-int',has_traffic_filtering=True,id=dd1a8613-e62a-44c6-9960-a46776a2c059,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd1a8613-e6')
Sep 30 07:16:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:20.540 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:16:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:20.540 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:16:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:20.541 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:16:20 compute-0 nova_compute[189265]: 2025-09-30 07:16:20.748 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:16:20 compute-0 nova_compute[189265]: 2025-09-30 07:16:20.749 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:16:20 compute-0 nova_compute[189265]: 2025-09-30 07:16:20.749 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] No VIF found with MAC fa:16:3e:ac:12:53, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 07:16:20 compute-0 nova_compute[189265]: 2025-09-30 07:16:20.750 2 INFO nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Using config drive
Sep 30 07:16:21 compute-0 nova_compute[189265]: 2025-09-30 07:16:21.271 2 WARNING neutronclient.v2_0.client [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:16:21 compute-0 nova_compute[189265]: 2025-09-30 07:16:21.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:22 compute-0 nova_compute[189265]: 2025-09-30 07:16:22.129 2 INFO nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Creating config drive at /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk.config
Sep 30 07:16:22 compute-0 nova_compute[189265]: 2025-09-30 07:16:22.142 2 DEBUG oslo_concurrency.processutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp83_u_b5m execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:16:22 compute-0 nova_compute[189265]: 2025-09-30 07:16:22.288 2 DEBUG oslo_concurrency.processutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp83_u_b5m" returned: 0 in 0.146s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:16:22 compute-0 kernel: tapdd1a8613-e6: entered promiscuous mode
Sep 30 07:16:22 compute-0 NetworkManager[51813]: <info>  [1759216582.3650] manager: (tapdd1a8613-e6): new Tun device (/org/freedesktop/NetworkManager/Devices/27)
Sep 30 07:16:22 compute-0 ovn_controller[91436]: 2025-09-30T07:16:22Z|00046|binding|INFO|Claiming lport dd1a8613-e62a-44c6-9960-a46776a2c059 for this chassis.
Sep 30 07:16:22 compute-0 ovn_controller[91436]: 2025-09-30T07:16:22Z|00047|binding|INFO|dd1a8613-e62a-44c6-9960-a46776a2c059: Claiming fa:16:3e:ac:12:53 10.100.0.6
Sep 30 07:16:22 compute-0 nova_compute[189265]: 2025-09-30 07:16:22.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:22 compute-0 systemd-udevd[213300]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:16:22 compute-0 ovn_controller[91436]: 2025-09-30T07:16:22Z|00048|binding|INFO|Setting lport dd1a8613-e62a-44c6-9960-a46776a2c059 ovn-installed in OVS
Sep 30 07:16:22 compute-0 nova_compute[189265]: 2025-09-30 07:16:22.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:22 compute-0 nova_compute[189265]: 2025-09-30 07:16:22.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:22.442 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:12:53 10.100.0.6'], port_security=['fa:16:3e:ac:12:53 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'd40a0fba-a20e-4dcf-a048-10d9e21c6cf6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1413b21c2db845e58d8a81f524a55f3a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8ad3c6f6-3842-4d69-92ac-cef07b75c3bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b541691-433c-426c-b8b7-10d79319603a, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=dd1a8613-e62a-44c6-9960-a46776a2c059) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:16:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:22.443 100322 INFO neutron.agent.ovn.metadata.agent [-] Port dd1a8613-e62a-44c6-9960-a46776a2c059 in datapath 74ffbf65-ebbd-4587-bf5b-0b38421a4813 bound to our chassis
Sep 30 07:16:22 compute-0 NetworkManager[51813]: <info>  [1759216582.4449] device (tapdd1a8613-e6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:16:22 compute-0 ovn_controller[91436]: 2025-09-30T07:16:22Z|00049|binding|INFO|Setting lport dd1a8613-e62a-44c6-9960-a46776a2c059 up in Southbound
Sep 30 07:16:22 compute-0 NetworkManager[51813]: <info>  [1759216582.4477] device (tapdd1a8613-e6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:16:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:22.448 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74ffbf65-ebbd-4587-bf5b-0b38421a4813
Sep 30 07:16:22 compute-0 systemd-machined[149233]: New machine qemu-2-instance-00000005.
Sep 30 07:16:22 compute-0 systemd[1]: Started Virtual Machine qemu-2-instance-00000005.
Sep 30 07:16:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:22.475 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[60243145-2b6f-4c94-ae4a-1ea463af4208]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:16:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:22.521 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[ae3aed29-56bc-4440-9ca4-7579ad60aa0a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:16:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:22.525 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[15c9e647-56b2-4f92-b4d7-a1b05a31a4ba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:16:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:22.568 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[950f2268-b27c-441e-ac3f-7a45d36508fb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:16:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:22.595 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f8a069fd-919f-4f5c-a590-a51b0546e342]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74ffbf65-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:ef:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434702, 'reachable_time': 35230, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213317, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:16:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:22.622 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[02325cd2-210d-4c6d-9514-614304689f96]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434715, 'tstamp': 434715}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213318, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434718, 'tstamp': 434718}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213318, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:16:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:22.624 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74ffbf65-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:16:22 compute-0 nova_compute[189265]: 2025-09-30 07:16:22.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:22 compute-0 nova_compute[189265]: 2025-09-30 07:16:22.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:22.629 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74ffbf65-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:16:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:22.629 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:16:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:22.630 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74ffbf65-e0, col_values=(('external_ids', {'iface-id': '0c700e20-e593-4a77-93d7-fc919dc1f294'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:16:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:22.630 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:16:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:22.632 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[8fb65155-f95b-475b-9243-cb68207e8a2b]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-74ffbf65-ebbd-4587-bf5b-0b38421a4813\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 74ffbf65-ebbd-4587-bf5b-0b38421a4813\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:16:23 compute-0 nova_compute[189265]: 2025-09-30 07:16:23.380 2 DEBUG nova.compute.manager [req-97db4c12-bd21-468c-88a2-b8f70515553b req-88dee81b-5b9c-4a85-8d72-a8a81235391d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Received event network-vif-plugged-dd1a8613-e62a-44c6-9960-a46776a2c059 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:16:23 compute-0 nova_compute[189265]: 2025-09-30 07:16:23.380 2 DEBUG oslo_concurrency.lockutils [req-97db4c12-bd21-468c-88a2-b8f70515553b req-88dee81b-5b9c-4a85-8d72-a8a81235391d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:16:23 compute-0 nova_compute[189265]: 2025-09-30 07:16:23.381 2 DEBUG oslo_concurrency.lockutils [req-97db4c12-bd21-468c-88a2-b8f70515553b req-88dee81b-5b9c-4a85-8d72-a8a81235391d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:16:23 compute-0 nova_compute[189265]: 2025-09-30 07:16:23.381 2 DEBUG oslo_concurrency.lockutils [req-97db4c12-bd21-468c-88a2-b8f70515553b req-88dee81b-5b9c-4a85-8d72-a8a81235391d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:16:23 compute-0 nova_compute[189265]: 2025-09-30 07:16:23.381 2 DEBUG nova.compute.manager [req-97db4c12-bd21-468c-88a2-b8f70515553b req-88dee81b-5b9c-4a85-8d72-a8a81235391d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Processing event network-vif-plugged-dd1a8613-e62a-44c6-9960-a46776a2c059 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 07:16:23 compute-0 nova_compute[189265]: 2025-09-30 07:16:23.382 2 DEBUG nova.compute.manager [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 07:16:23 compute-0 nova_compute[189265]: 2025-09-30 07:16:23.386 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 07:16:23 compute-0 nova_compute[189265]: 2025-09-30 07:16:23.390 2 INFO nova.virt.libvirt.driver [-] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Instance spawned successfully.
Sep 30 07:16:23 compute-0 nova_compute[189265]: 2025-09-30 07:16:23.391 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 07:16:23 compute-0 nova_compute[189265]: 2025-09-30 07:16:23.947 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:16:23 compute-0 nova_compute[189265]: 2025-09-30 07:16:23.947 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:16:23 compute-0 nova_compute[189265]: 2025-09-30 07:16:23.948 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:16:23 compute-0 nova_compute[189265]: 2025-09-30 07:16:23.948 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:16:23 compute-0 nova_compute[189265]: 2025-09-30 07:16:23.949 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:16:23 compute-0 nova_compute[189265]: 2025-09-30 07:16:23.949 2 DEBUG nova.virt.libvirt.driver [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:16:24 compute-0 nova_compute[189265]: 2025-09-30 07:16:24.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:24 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:24.228 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:16:24 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:24.230 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:16:24 compute-0 nova_compute[189265]: 2025-09-30 07:16:24.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:24 compute-0 nova_compute[189265]: 2025-09-30 07:16:24.484 2 INFO nova.compute.manager [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Took 11.39 seconds to spawn the instance on the hypervisor.
Sep 30 07:16:24 compute-0 nova_compute[189265]: 2025-09-30 07:16:24.485 2 DEBUG nova.compute.manager [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:16:25 compute-0 nova_compute[189265]: 2025-09-30 07:16:25.068 2 INFO nova.compute.manager [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Took 16.88 seconds to build instance.
Sep 30 07:16:25 compute-0 nova_compute[189265]: 2025-09-30 07:16:25.468 2 DEBUG nova.compute.manager [req-e65f7dac-56d4-4f8b-a07b-f42859ec369c req-506b252b-73e2-49fc-9170-03bc2c9ff51b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Received event network-vif-plugged-dd1a8613-e62a-44c6-9960-a46776a2c059 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:16:25 compute-0 nova_compute[189265]: 2025-09-30 07:16:25.469 2 DEBUG oslo_concurrency.lockutils [req-e65f7dac-56d4-4f8b-a07b-f42859ec369c req-506b252b-73e2-49fc-9170-03bc2c9ff51b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:16:25 compute-0 nova_compute[189265]: 2025-09-30 07:16:25.470 2 DEBUG oslo_concurrency.lockutils [req-e65f7dac-56d4-4f8b-a07b-f42859ec369c req-506b252b-73e2-49fc-9170-03bc2c9ff51b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:16:25 compute-0 nova_compute[189265]: 2025-09-30 07:16:25.470 2 DEBUG oslo_concurrency.lockutils [req-e65f7dac-56d4-4f8b-a07b-f42859ec369c req-506b252b-73e2-49fc-9170-03bc2c9ff51b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:16:25 compute-0 nova_compute[189265]: 2025-09-30 07:16:25.470 2 DEBUG nova.compute.manager [req-e65f7dac-56d4-4f8b-a07b-f42859ec369c req-506b252b-73e2-49fc-9170-03bc2c9ff51b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] No waiting events found dispatching network-vif-plugged-dd1a8613-e62a-44c6-9960-a46776a2c059 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:16:25 compute-0 nova_compute[189265]: 2025-09-30 07:16:25.470 2 WARNING nova.compute.manager [req-e65f7dac-56d4-4f8b-a07b-f42859ec369c req-506b252b-73e2-49fc-9170-03bc2c9ff51b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Received unexpected event network-vif-plugged-dd1a8613-e62a-44c6-9960-a46776a2c059 for instance with vm_state active and task_state None.
Sep 30 07:16:25 compute-0 nova_compute[189265]: 2025-09-30 07:16:25.608 2 DEBUG oslo_concurrency.lockutils [None req-6fb6eed4-0131-4160-a3a7-b4f180509d0a d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.445s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:16:26 compute-0 nova_compute[189265]: 2025-09-30 07:16:26.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:29 compute-0 nova_compute[189265]: 2025-09-30 07:16:29.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:29 compute-0 podman[199733]: time="2025-09-30T07:16:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:16:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:16:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:16:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:16:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3458 "" "Go-http-client/1.1"
Sep 30 07:16:30 compute-0 nova_compute[189265]: 2025-09-30 07:16:30.784 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:16:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:16:31.232 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:16:31 compute-0 openstack_network_exporter[201859]: ERROR   07:16:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:16:31 compute-0 openstack_network_exporter[201859]: ERROR   07:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:16:31 compute-0 openstack_network_exporter[201859]: ERROR   07:16:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:16:31 compute-0 openstack_network_exporter[201859]: ERROR   07:16:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:16:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:16:31 compute-0 openstack_network_exporter[201859]: ERROR   07:16:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:16:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:16:31 compute-0 podman[213327]: 2025-09-30 07:16:31.497713704 +0000 UTC m=+0.067510137 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:16:31 compute-0 nova_compute[189265]: 2025-09-30 07:16:31.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:16:31 compute-0 nova_compute[189265]: 2025-09-30 07:16:31.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:33 compute-0 nova_compute[189265]: 2025-09-30 07:16:33.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:16:33 compute-0 nova_compute[189265]: 2025-09-30 07:16:33.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:16:34 compute-0 nova_compute[189265]: 2025-09-30 07:16:34.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:34 compute-0 nova_compute[189265]: 2025-09-30 07:16:34.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:16:34 compute-0 nova_compute[189265]: 2025-09-30 07:16:34.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:16:34 compute-0 nova_compute[189265]: 2025-09-30 07:16:34.789 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:16:34 compute-0 nova_compute[189265]: 2025-09-30 07:16:34.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:16:35 compute-0 nova_compute[189265]: 2025-09-30 07:16:35.359 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:16:35 compute-0 nova_compute[189265]: 2025-09-30 07:16:35.360 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:16:35 compute-0 nova_compute[189265]: 2025-09-30 07:16:35.360 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:16:35 compute-0 nova_compute[189265]: 2025-09-30 07:16:35.360 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:16:35 compute-0 ovn_controller[91436]: 2025-09-30T07:16:35Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ac:12:53 10.100.0.6
Sep 30 07:16:35 compute-0 ovn_controller[91436]: 2025-09-30T07:16:35Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ac:12:53 10.100.0.6
Sep 30 07:16:36 compute-0 nova_compute[189265]: 2025-09-30 07:16:36.441 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:16:36 compute-0 nova_compute[189265]: 2025-09-30 07:16:36.512 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:16:36 compute-0 nova_compute[189265]: 2025-09-30 07:16:36.513 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:16:36 compute-0 nova_compute[189265]: 2025-09-30 07:16:36.570 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:16:36 compute-0 nova_compute[189265]: 2025-09-30 07:16:36.574 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:16:36 compute-0 nova_compute[189265]: 2025-09-30 07:16:36.620 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:16:36 compute-0 nova_compute[189265]: 2025-09-30 07:16:36.621 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:16:36 compute-0 nova_compute[189265]: 2025-09-30 07:16:36.669 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:16:36 compute-0 nova_compute[189265]: 2025-09-30 07:16:36.781 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:16:36 compute-0 nova_compute[189265]: 2025-09-30 07:16:36.783 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:16:36 compute-0 nova_compute[189265]: 2025-09-30 07:16:36.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:36 compute-0 nova_compute[189265]: 2025-09-30 07:16:36.809 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:16:36 compute-0 nova_compute[189265]: 2025-09-30 07:16:36.810 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5565MB free_disk=73.25091171264648GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:16:36 compute-0 nova_compute[189265]: 2025-09-30 07:16:36.811 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:16:36 compute-0 nova_compute[189265]: 2025-09-30 07:16:36.811 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:16:38 compute-0 nova_compute[189265]: 2025-09-30 07:16:38.762 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance 9fa193fb-a398-4552-85b4-a346dffcf697 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:16:38 compute-0 nova_compute[189265]: 2025-09-30 07:16:38.762 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance d40a0fba-a20e-4dcf-a048-10d9e21c6cf6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:16:38 compute-0 nova_compute[189265]: 2025-09-30 07:16:38.763 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:16:38 compute-0 nova_compute[189265]: 2025-09-30 07:16:38.763 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:16:36 up  1:14,  0 user,  load average: 0.50, 0.35, 0.46\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '2', 'num_os_type_None': '2', 'num_proj_1413b21c2db845e58d8a81f524a55f3a': '2', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:16:38 compute-0 nova_compute[189265]: 2025-09-30 07:16:38.823 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:16:39 compute-0 nova_compute[189265]: 2025-09-30 07:16:39.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:39 compute-0 nova_compute[189265]: 2025-09-30 07:16:39.354 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:16:39 compute-0 nova_compute[189265]: 2025-09-30 07:16:39.877 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:16:39 compute-0 nova_compute[189265]: 2025-09-30 07:16:39.878 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.066s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:16:41 compute-0 podman[213373]: 2025-09-30 07:16:41.533013571 +0000 UTC m=+0.101109967 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, tcib_build_tag=watcher_latest, config_id=iscsid)
Sep 30 07:16:41 compute-0 nova_compute[189265]: 2025-09-30 07:16:41.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:43 compute-0 nova_compute[189265]: 2025-09-30 07:16:43.877 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:16:44 compute-0 nova_compute[189265]: 2025-09-30 07:16:44.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:45 compute-0 podman[213393]: 2025-09-30 07:16:45.526701088 +0000 UTC m=+0.093185398 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, distribution-scope=public, name=ubi9-minimal, version=9.6, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container)
Sep 30 07:16:46 compute-0 nova_compute[189265]: 2025-09-30 07:16:46.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:47 compute-0 podman[213414]: 2025-09-30 07:16:47.562303812 +0000 UTC m=+0.139029878 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:16:47 compute-0 podman[213415]: 2025-09-30 07:16:47.581206247 +0000 UTC m=+0.163132353 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:16:49 compute-0 nova_compute[189265]: 2025-09-30 07:16:49.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:49 compute-0 podman[213461]: 2025-09-30 07:16:49.492454049 +0000 UTC m=+0.067418645 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 07:16:51 compute-0 nova_compute[189265]: 2025-09-30 07:16:51.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:54 compute-0 nova_compute[189265]: 2025-09-30 07:16:54.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:56 compute-0 nova_compute[189265]: 2025-09-30 07:16:56.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:56 compute-0 nova_compute[189265]: 2025-09-30 07:16:56.946 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "992a0681-bc5e-40b3-adf3-305eee0718fd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:16:56 compute-0 nova_compute[189265]: 2025-09-30 07:16:56.946 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:16:57 compute-0 nova_compute[189265]: 2025-09-30 07:16:57.454 2 DEBUG nova.compute.manager [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 07:16:57 compute-0 nova_compute[189265]: 2025-09-30 07:16:57.996 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:16:57 compute-0 nova_compute[189265]: 2025-09-30 07:16:57.997 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:16:58 compute-0 nova_compute[189265]: 2025-09-30 07:16:58.005 2 DEBUG nova.virt.hardware [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 07:16:58 compute-0 nova_compute[189265]: 2025-09-30 07:16:58.005 2 INFO nova.compute.claims [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Claim successful on node compute-0.ctlplane.example.com
Sep 30 07:16:59 compute-0 nova_compute[189265]: 2025-09-30 07:16:59.091 2 DEBUG nova.compute.provider_tree [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:16:59 compute-0 nova_compute[189265]: 2025-09-30 07:16:59.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:16:59 compute-0 nova_compute[189265]: 2025-09-30 07:16:59.600 2 DEBUG nova.scheduler.client.report [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:16:59 compute-0 podman[199733]: time="2025-09-30T07:16:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:16:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:16:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:16:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:16:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3461 "" "Go-http-client/1.1"
Sep 30 07:17:00 compute-0 nova_compute[189265]: 2025-09-30 07:17:00.113 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.116s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:00 compute-0 nova_compute[189265]: 2025-09-30 07:17:00.114 2 DEBUG nova.compute.manager [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 07:17:00 compute-0 nova_compute[189265]: 2025-09-30 07:17:00.624 2 DEBUG nova.compute.manager [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 07:17:00 compute-0 nova_compute[189265]: 2025-09-30 07:17:00.624 2 DEBUG nova.network.neutron [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 07:17:00 compute-0 nova_compute[189265]: 2025-09-30 07:17:00.625 2 WARNING neutronclient.v2_0.client [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:00 compute-0 nova_compute[189265]: 2025-09-30 07:17:00.625 2 WARNING neutronclient.v2_0.client [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:01 compute-0 nova_compute[189265]: 2025-09-30 07:17:01.133 2 INFO nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 07:17:01 compute-0 openstack_network_exporter[201859]: ERROR   07:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:17:01 compute-0 openstack_network_exporter[201859]: ERROR   07:17:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:17:01 compute-0 openstack_network_exporter[201859]: ERROR   07:17:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:17:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:17:01 compute-0 openstack_network_exporter[201859]: ERROR   07:17:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:17:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:17:01 compute-0 openstack_network_exporter[201859]: ERROR   07:17:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:17:01 compute-0 nova_compute[189265]: 2025-09-30 07:17:01.650 2 DEBUG nova.compute.manager [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 07:17:01 compute-0 nova_compute[189265]: 2025-09-30 07:17:01.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.463 2 DEBUG nova.network.neutron [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Successfully created port: a38c613b-6d8b-4dc3-96e4-4a4103b20d91 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 07:17:02 compute-0 podman[213483]: 2025-09-30 07:17:02.510876975 +0000 UTC m=+0.076307121 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.710 2 DEBUG nova.compute.manager [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.712 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.712 2 INFO nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Creating image(s)
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.713 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "/var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.713 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "/var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.714 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "/var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.714 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.718 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.721 2 DEBUG oslo_concurrency.processutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.816 2 DEBUG oslo_concurrency.processutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.818 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.818 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.819 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.825 2 DEBUG oslo_utils.imageutils.format_inspector [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.826 2 DEBUG oslo_concurrency.processutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.891 2 DEBUG oslo_concurrency.processutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.893 2 DEBUG oslo_concurrency.processutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.936 2 DEBUG oslo_concurrency.processutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.937 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.118s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.937 2 DEBUG oslo_concurrency.processutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.984 2 DEBUG oslo_concurrency.processutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.047s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.985 2 DEBUG nova.virt.disk.api [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Checking if we can resize image /var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:17:02 compute-0 nova_compute[189265]: 2025-09-30 07:17:02.985 2 DEBUG oslo_concurrency.processutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:03 compute-0 nova_compute[189265]: 2025-09-30 07:17:03.031 2 DEBUG oslo_concurrency.processutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk --force-share --output=json" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:03 compute-0 nova_compute[189265]: 2025-09-30 07:17:03.032 2 DEBUG nova.virt.disk.api [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Cannot resize image /var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:17:03 compute-0 nova_compute[189265]: 2025-09-30 07:17:03.032 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 07:17:03 compute-0 nova_compute[189265]: 2025-09-30 07:17:03.032 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Ensure instance console log exists: /var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 07:17:03 compute-0 nova_compute[189265]: 2025-09-30 07:17:03.033 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:03 compute-0 nova_compute[189265]: 2025-09-30 07:17:03.033 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:03 compute-0 nova_compute[189265]: 2025-09-30 07:17:03.033 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:04 compute-0 nova_compute[189265]: 2025-09-30 07:17:04.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:04 compute-0 nova_compute[189265]: 2025-09-30 07:17:04.328 2 DEBUG nova.network.neutron [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Successfully updated port: a38c613b-6d8b-4dc3-96e4-4a4103b20d91 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 07:17:04 compute-0 nova_compute[189265]: 2025-09-30 07:17:04.415 2 DEBUG nova.compute.manager [req-39a97e4a-1505-46e7-8bb4-05dd89c65673 req-a3e984e9-e5ce-40bb-89a9-ef01a82ea075 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received event network-changed-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:17:04 compute-0 nova_compute[189265]: 2025-09-30 07:17:04.416 2 DEBUG nova.compute.manager [req-39a97e4a-1505-46e7-8bb4-05dd89c65673 req-a3e984e9-e5ce-40bb-89a9-ef01a82ea075 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Refreshing instance network info cache due to event network-changed-a38c613b-6d8b-4dc3-96e4-4a4103b20d91. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 07:17:04 compute-0 nova_compute[189265]: 2025-09-30 07:17:04.416 2 DEBUG oslo_concurrency.lockutils [req-39a97e4a-1505-46e7-8bb4-05dd89c65673 req-a3e984e9-e5ce-40bb-89a9-ef01a82ea075 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-992a0681-bc5e-40b3-adf3-305eee0718fd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:17:04 compute-0 nova_compute[189265]: 2025-09-30 07:17:04.417 2 DEBUG oslo_concurrency.lockutils [req-39a97e4a-1505-46e7-8bb4-05dd89c65673 req-a3e984e9-e5ce-40bb-89a9-ef01a82ea075 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-992a0681-bc5e-40b3-adf3-305eee0718fd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:17:04 compute-0 nova_compute[189265]: 2025-09-30 07:17:04.417 2 DEBUG nova.network.neutron [req-39a97e4a-1505-46e7-8bb4-05dd89c65673 req-a3e984e9-e5ce-40bb-89a9-ef01a82ea075 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Refreshing network info cache for port a38c613b-6d8b-4dc3-96e4-4a4103b20d91 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 07:17:04 compute-0 nova_compute[189265]: 2025-09-30 07:17:04.836 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "refresh_cache-992a0681-bc5e-40b3-adf3-305eee0718fd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:17:04 compute-0 nova_compute[189265]: 2025-09-30 07:17:04.924 2 WARNING neutronclient.v2_0.client [req-39a97e4a-1505-46e7-8bb4-05dd89c65673 req-a3e984e9-e5ce-40bb-89a9-ef01a82ea075 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:05 compute-0 nova_compute[189265]: 2025-09-30 07:17:05.177 2 DEBUG nova.network.neutron [req-39a97e4a-1505-46e7-8bb4-05dd89c65673 req-a3e984e9-e5ce-40bb-89a9-ef01a82ea075 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:17:05 compute-0 nova_compute[189265]: 2025-09-30 07:17:05.357 2 DEBUG nova.network.neutron [req-39a97e4a-1505-46e7-8bb4-05dd89c65673 req-a3e984e9-e5ce-40bb-89a9-ef01a82ea075 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:17:05 compute-0 nova_compute[189265]: 2025-09-30 07:17:05.869 2 DEBUG oslo_concurrency.lockutils [req-39a97e4a-1505-46e7-8bb4-05dd89c65673 req-a3e984e9-e5ce-40bb-89a9-ef01a82ea075 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-992a0681-bc5e-40b3-adf3-305eee0718fd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:17:05 compute-0 nova_compute[189265]: 2025-09-30 07:17:05.871 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquired lock "refresh_cache-992a0681-bc5e-40b3-adf3-305eee0718fd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:17:05 compute-0 nova_compute[189265]: 2025-09-30 07:17:05.871 2 DEBUG nova.network.neutron [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:17:06 compute-0 nova_compute[189265]: 2025-09-30 07:17:06.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:07 compute-0 nova_compute[189265]: 2025-09-30 07:17:07.187 2 DEBUG nova.network.neutron [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:17:07 compute-0 nova_compute[189265]: 2025-09-30 07:17:07.426 2 WARNING neutronclient.v2_0.client [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:07 compute-0 nova_compute[189265]: 2025-09-30 07:17:07.723 2 DEBUG nova.network.neutron [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Updating instance_info_cache with network_info: [{"id": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "address": "fa:16:3e:ec:ab:8a", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38c613b-6d", "ovs_interfaceid": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.231 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Releasing lock "refresh_cache-992a0681-bc5e-40b3-adf3-305eee0718fd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.232 2 DEBUG nova.compute.manager [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Instance network_info: |[{"id": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "address": "fa:16:3e:ec:ab:8a", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38c613b-6d", "ovs_interfaceid": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.237 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Start _get_guest_xml network_info=[{"id": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "address": "fa:16:3e:ec:ab:8a", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38c613b-6d", "ovs_interfaceid": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '0c6b92f5-9861-49e4-862d-3ffd84520dfa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.243 2 WARNING nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.244 2 DEBUG nova.virt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1531701854', uuid='992a0681-bc5e-40b3-adf3-305eee0718fd'), owner=OwnerMeta(userid='d6cb6be5d6fc407eb3abc1c7c70f5d77', username='tempest-TestExecuteActionsViaActuator-2061885601-project-admin', projectid='1413b21c2db845e58d8a81f524a55f3a', projectname='tempest-TestExecuteActionsViaActuator-2061885601'), image=ImageMeta(id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "address": "fa:16:3e:ec:ab:8a", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38c613b-6d", "ovs_interfaceid": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759216628.2448802) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.250 2 DEBUG nova.virt.libvirt.host [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.251 2 DEBUG nova.virt.libvirt.host [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.253 2 DEBUG nova.virt.libvirt.host [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.254 2 DEBUG nova.virt.libvirt.host [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.254 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.254 2 DEBUG nova.virt.hardware [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T07:07:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.255 2 DEBUG nova.virt.hardware [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.255 2 DEBUG nova.virt.hardware [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.255 2 DEBUG nova.virt.hardware [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.256 2 DEBUG nova.virt.hardware [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.256 2 DEBUG nova.virt.hardware [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.256 2 DEBUG nova.virt.hardware [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.256 2 DEBUG nova.virt.hardware [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.256 2 DEBUG nova.virt.hardware [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.257 2 DEBUG nova.virt.hardware [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.257 2 DEBUG nova.virt.hardware [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.260 2 DEBUG nova.virt.libvirt.vif [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:16:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1531701854',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1531701854',id=7,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1413b21c2db845e58d8a81f524a55f3a',ramdisk_id='',reservation_id='r-f20tme0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-2061885601',owner_user_name='tempest-TestExecuteActionsViaActuator-2061885601-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:17:01Z,user_data=None,user_id='d6cb6be5d6fc407eb3abc1c7c70f5d77',uuid=992a0681-bc5e-40b3-adf3-305eee0718fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "address": "fa:16:3e:ec:ab:8a", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38c613b-6d", "ovs_interfaceid": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.261 2 DEBUG nova.network.os_vif_util [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converting VIF {"id": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "address": "fa:16:3e:ec:ab:8a", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38c613b-6d", "ovs_interfaceid": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.261 2 DEBUG nova.network.os_vif_util [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:ab:8a,bridge_name='br-int',has_traffic_filtering=True,id=a38c613b-6d8b-4dc3-96e4-4a4103b20d91,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38c613b-6d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.262 2 DEBUG nova.objects.instance [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lazy-loading 'pci_devices' on Instance uuid 992a0681-bc5e-40b3-adf3-305eee0718fd obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.772 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] End _get_guest_xml xml=<domain type="kvm">
Sep 30 07:17:08 compute-0 nova_compute[189265]:   <uuid>992a0681-bc5e-40b3-adf3-305eee0718fd</uuid>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   <name>instance-00000007</name>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   <memory>131072</memory>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   <vcpu>1</vcpu>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1531701854</nova:name>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:17:08</nova:creationTime>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:17:08 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:17:08 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:17:08 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:17:08 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:17:08 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:17:08 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:17:08 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:17:08 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:17:08 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:17:08 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:17:08 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:17:08 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:17:08 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:17:08 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:17:08 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:17:08 compute-0 nova_compute[189265]:         <nova:user uuid="d6cb6be5d6fc407eb3abc1c7c70f5d77">tempest-TestExecuteActionsViaActuator-2061885601-project-admin</nova:user>
Sep 30 07:17:08 compute-0 nova_compute[189265]:         <nova:project uuid="1413b21c2db845e58d8a81f524a55f3a">tempest-TestExecuteActionsViaActuator-2061885601</nova:project>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:17:08 compute-0 nova_compute[189265]:         <nova:port uuid="a38c613b-6d8b-4dc3-96e4-4a4103b20d91">
Sep 30 07:17:08 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <system>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <entry name="serial">992a0681-bc5e-40b3-adf3-305eee0718fd</entry>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <entry name="uuid">992a0681-bc5e-40b3-adf3-305eee0718fd</entry>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     </system>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   <os>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   </os>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   <features>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <vmcoreinfo/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   </features>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   <cpu mode="host-model" match="exact">
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk.config"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:ec:ab:8a"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <target dev="tapa38c613b-6d"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/console.log" append="off"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <video>
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     </video>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <controller type="usb" index="0"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:17:08 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:17:08 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:17:08 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:17:08 compute-0 nova_compute[189265]: </domain>
Sep 30 07:17:08 compute-0 nova_compute[189265]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.774 2 DEBUG nova.compute.manager [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Preparing to wait for external event network-vif-plugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.774 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.775 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.775 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.776 2 DEBUG nova.virt.libvirt.vif [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:16:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1531701854',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1531701854',id=7,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1413b21c2db845e58d8a81f524a55f3a',ramdisk_id='',reservation_id='r-f20tme0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteActionsViaActuator-2061885601',owner_user_name='tempest-TestExecuteActionsViaActuator-2061885601-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:17:01Z,user_data=None,user_id='d6cb6be5d6fc407eb3abc1c7c70f5d77',uuid=992a0681-bc5e-40b3-adf3-305eee0718fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "address": "fa:16:3e:ec:ab:8a", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38c613b-6d", "ovs_interfaceid": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.777 2 DEBUG nova.network.os_vif_util [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converting VIF {"id": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "address": "fa:16:3e:ec:ab:8a", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38c613b-6d", "ovs_interfaceid": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.778 2 DEBUG nova.network.os_vif_util [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:ab:8a,bridge_name='br-int',has_traffic_filtering=True,id=a38c613b-6d8b-4dc3-96e4-4a4103b20d91,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38c613b-6d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.778 2 DEBUG os_vif [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:ab:8a,bridge_name='br-int',has_traffic_filtering=True,id=a38c613b-6d8b-4dc3-96e4-4a4103b20d91,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38c613b-6d') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.780 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.781 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.782 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '0bfc30da-82ad-50d5-8c3d-06eaca9c7073', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.786 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.790 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa38c613b-6d, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.791 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa38c613b-6d, col_values=(('qos', UUID('c313f9cc-bcca-4194-b48d-4009c6e3a9e5')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.792 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa38c613b-6d, col_values=(('external_ids', {'iface-id': 'a38c613b-6d8b-4dc3-96e4-4a4103b20d91', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ec:ab:8a', 'vm-uuid': '992a0681-bc5e-40b3-adf3-305eee0718fd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:08 compute-0 NetworkManager[51813]: <info>  [1759216628.7945] manager: (tapa38c613b-6d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:08 compute-0 nova_compute[189265]: 2025-09-30 07:17:08.802 2 INFO os_vif [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:ab:8a,bridge_name='br-int',has_traffic_filtering=True,id=a38c613b-6d8b-4dc3-96e4-4a4103b20d91,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38c613b-6d')
Sep 30 07:17:10 compute-0 nova_compute[189265]: 2025-09-30 07:17:10.375 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:17:10 compute-0 nova_compute[189265]: 2025-09-30 07:17:10.375 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:17:10 compute-0 nova_compute[189265]: 2025-09-30 07:17:10.376 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] No VIF found with MAC fa:16:3e:ec:ab:8a, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 07:17:10 compute-0 nova_compute[189265]: 2025-09-30 07:17:10.377 2 INFO nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Using config drive
Sep 30 07:17:10 compute-0 nova_compute[189265]: 2025-09-30 07:17:10.901 2 WARNING neutronclient.v2_0.client [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:11 compute-0 nova_compute[189265]: 2025-09-30 07:17:11.170 2 INFO nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Creating config drive at /var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk.config
Sep 30 07:17:11 compute-0 nova_compute[189265]: 2025-09-30 07:17:11.175 2 DEBUG oslo_concurrency.processutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp4yg288n9 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:11 compute-0 nova_compute[189265]: 2025-09-30 07:17:11.312 2 DEBUG oslo_concurrency.processutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp4yg288n9" returned: 0 in 0.136s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:11 compute-0 kernel: tapa38c613b-6d: entered promiscuous mode
Sep 30 07:17:11 compute-0 NetworkManager[51813]: <info>  [1759216631.4112] manager: (tapa38c613b-6d): new Tun device (/org/freedesktop/NetworkManager/Devices/29)
Sep 30 07:17:11 compute-0 ovn_controller[91436]: 2025-09-30T07:17:11Z|00050|binding|INFO|Claiming lport a38c613b-6d8b-4dc3-96e4-4a4103b20d91 for this chassis.
Sep 30 07:17:11 compute-0 ovn_controller[91436]: 2025-09-30T07:17:11Z|00051|binding|INFO|a38c613b-6d8b-4dc3-96e4-4a4103b20d91: Claiming fa:16:3e:ec:ab:8a 10.100.0.7
Sep 30 07:17:11 compute-0 nova_compute[189265]: 2025-09-30 07:17:11.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:11 compute-0 ovn_controller[91436]: 2025-09-30T07:17:11Z|00052|binding|INFO|Setting lport a38c613b-6d8b-4dc3-96e4-4a4103b20d91 ovn-installed in OVS
Sep 30 07:17:11 compute-0 nova_compute[189265]: 2025-09-30 07:17:11.442 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:11 compute-0 nova_compute[189265]: 2025-09-30 07:17:11.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:11 compute-0 ovn_controller[91436]: 2025-09-30T07:17:11Z|00053|binding|INFO|Setting lport a38c613b-6d8b-4dc3-96e4-4a4103b20d91 up in Southbound
Sep 30 07:17:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:11.454 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:ab:8a 10.100.0.7'], port_security=['fa:16:3e:ec:ab:8a 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '992a0681-bc5e-40b3-adf3-305eee0718fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1413b21c2db845e58d8a81f524a55f3a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8ad3c6f6-3842-4d69-92ac-cef07b75c3bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b541691-433c-426c-b8b7-10d79319603a, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=a38c613b-6d8b-4dc3-96e4-4a4103b20d91) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:17:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:11.455 100322 INFO neutron.agent.ovn.metadata.agent [-] Port a38c613b-6d8b-4dc3-96e4-4a4103b20d91 in datapath 74ffbf65-ebbd-4587-bf5b-0b38421a4813 bound to our chassis
Sep 30 07:17:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:11.460 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74ffbf65-ebbd-4587-bf5b-0b38421a4813
Sep 30 07:17:11 compute-0 systemd-udevd[213541]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:17:11 compute-0 systemd-machined[149233]: New machine qemu-3-instance-00000007.
Sep 30 07:17:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:11.486 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[aea00a78-29b4-4d01-822e-1267db4c2bbb]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:11 compute-0 NetworkManager[51813]: <info>  [1759216631.4915] device (tapa38c613b-6d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:17:11 compute-0 NetworkManager[51813]: <info>  [1759216631.4928] device (tapa38c613b-6d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:17:11 compute-0 systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Sep 30 07:17:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:11.534 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[35d9d419-6e58-405c-9eca-df8ab687b544]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:11.537 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[0134ae20-3aa5-4cab-b3f4-45dfec95db64]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:11.570 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[163d6ede-27f6-4a40-bf38-6de0bfbf1faf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:11.598 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d4c3ac9e-13ff-4da2-84e1-0d337ebeb4cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74ffbf65-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:ef:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 7, 'rx_bytes': 1000, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434702, 'reachable_time': 35230, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213555, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:11.623 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[fc4d5d74-da6e-4cdb-9c3b-a023238f7755]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434715, 'tstamp': 434715}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213557, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434718, 'tstamp': 434718}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213557, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:11.626 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74ffbf65-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:11 compute-0 nova_compute[189265]: 2025-09-30 07:17:11.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:11 compute-0 nova_compute[189265]: 2025-09-30 07:17:11.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:11.661 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74ffbf65-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:11.662 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:17:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:11.662 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74ffbf65-e0, col_values=(('external_ids', {'iface-id': '0c700e20-e593-4a77-93d7-fc919dc1f294'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:11.663 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:17:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:11.664 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d8b54de3-f25b-4ef5-8543-25c2108cbb40]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-74ffbf65-ebbd-4587-bf5b-0b38421a4813\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 74ffbf65-ebbd-4587-bf5b-0b38421a4813\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:11 compute-0 nova_compute[189265]: 2025-09-30 07:17:11.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:11 compute-0 nova_compute[189265]: 2025-09-30 07:17:11.912 2 DEBUG nova.compute.manager [req-8fd30721-5766-433a-89aa-2aaaf3e68a51 req-53bb9743-eb47-4bdd-a1b5-1e5aeba57ee5 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received event network-vif-plugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:17:11 compute-0 nova_compute[189265]: 2025-09-30 07:17:11.913 2 DEBUG oslo_concurrency.lockutils [req-8fd30721-5766-433a-89aa-2aaaf3e68a51 req-53bb9743-eb47-4bdd-a1b5-1e5aeba57ee5 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:11 compute-0 nova_compute[189265]: 2025-09-30 07:17:11.913 2 DEBUG oslo_concurrency.lockutils [req-8fd30721-5766-433a-89aa-2aaaf3e68a51 req-53bb9743-eb47-4bdd-a1b5-1e5aeba57ee5 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:11 compute-0 nova_compute[189265]: 2025-09-30 07:17:11.913 2 DEBUG oslo_concurrency.lockutils [req-8fd30721-5766-433a-89aa-2aaaf3e68a51 req-53bb9743-eb47-4bdd-a1b5-1e5aeba57ee5 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:11 compute-0 nova_compute[189265]: 2025-09-30 07:17:11.913 2 DEBUG nova.compute.manager [req-8fd30721-5766-433a-89aa-2aaaf3e68a51 req-53bb9743-eb47-4bdd-a1b5-1e5aeba57ee5 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Processing event network-vif-plugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 07:17:12 compute-0 podman[213567]: 2025-09-30 07:17:12.507162917 +0000 UTC m=+0.077321271 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=iscsid)
Sep 30 07:17:12 compute-0 unix_chkpwd[213589]: password check failed for user (root)
Sep 30 07:17:12 compute-0 sshd-session[213558]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Sep 30 07:17:12 compute-0 nova_compute[189265]: 2025-09-30 07:17:12.754 2 DEBUG nova.compute.manager [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 07:17:12 compute-0 nova_compute[189265]: 2025-09-30 07:17:12.765 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 07:17:12 compute-0 nova_compute[189265]: 2025-09-30 07:17:12.768 2 INFO nova.virt.libvirt.driver [-] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Instance spawned successfully.
Sep 30 07:17:12 compute-0 nova_compute[189265]: 2025-09-30 07:17:12.768 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 07:17:13 compute-0 nova_compute[189265]: 2025-09-30 07:17:13.322 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:17:13 compute-0 nova_compute[189265]: 2025-09-30 07:17:13.323 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:17:13 compute-0 nova_compute[189265]: 2025-09-30 07:17:13.324 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:17:13 compute-0 nova_compute[189265]: 2025-09-30 07:17:13.324 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:17:13 compute-0 nova_compute[189265]: 2025-09-30 07:17:13.325 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:17:13 compute-0 nova_compute[189265]: 2025-09-30 07:17:13.326 2 DEBUG nova.virt.libvirt.driver [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:17:13 compute-0 nova_compute[189265]: 2025-09-30 07:17:13.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:13 compute-0 nova_compute[189265]: 2025-09-30 07:17:13.868 2 INFO nova.compute.manager [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Took 11.16 seconds to spawn the instance on the hypervisor.
Sep 30 07:17:13 compute-0 nova_compute[189265]: 2025-09-30 07:17:13.869 2 DEBUG nova.compute.manager [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:17:14 compute-0 nova_compute[189265]: 2025-09-30 07:17:14.498 2 INFO nova.compute.manager [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Took 16.53 seconds to build instance.
Sep 30 07:17:14 compute-0 sshd-session[213558]: Failed password for root from 193.46.255.33 port 31732 ssh2
Sep 30 07:17:14 compute-0 nova_compute[189265]: 2025-09-30 07:17:14.620 2 DEBUG nova.compute.manager [req-462af2be-eddb-4da2-ac9e-d3cac84127f8 req-695ff238-a18d-46e1-b767-63791ce65fb3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received event network-vif-plugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:17:14 compute-0 nova_compute[189265]: 2025-09-30 07:17:14.620 2 DEBUG oslo_concurrency.lockutils [req-462af2be-eddb-4da2-ac9e-d3cac84127f8 req-695ff238-a18d-46e1-b767-63791ce65fb3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:14 compute-0 nova_compute[189265]: 2025-09-30 07:17:14.620 2 DEBUG oslo_concurrency.lockutils [req-462af2be-eddb-4da2-ac9e-d3cac84127f8 req-695ff238-a18d-46e1-b767-63791ce65fb3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:14 compute-0 nova_compute[189265]: 2025-09-30 07:17:14.621 2 DEBUG oslo_concurrency.lockutils [req-462af2be-eddb-4da2-ac9e-d3cac84127f8 req-695ff238-a18d-46e1-b767-63791ce65fb3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:14 compute-0 nova_compute[189265]: 2025-09-30 07:17:14.621 2 DEBUG nova.compute.manager [req-462af2be-eddb-4da2-ac9e-d3cac84127f8 req-695ff238-a18d-46e1-b767-63791ce65fb3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] No waiting events found dispatching network-vif-plugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:17:14 compute-0 nova_compute[189265]: 2025-09-30 07:17:14.621 2 WARNING nova.compute.manager [req-462af2be-eddb-4da2-ac9e-d3cac84127f8 req-695ff238-a18d-46e1-b767-63791ce65fb3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received unexpected event network-vif-plugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 for instance with vm_state active and task_state None.
Sep 30 07:17:14 compute-0 unix_chkpwd[213591]: password check failed for user (root)
Sep 30 07:17:15 compute-0 nova_compute[189265]: 2025-09-30 07:17:15.007 2 DEBUG oslo_concurrency.lockutils [None req-9de944a4-d563-4930-ac5e-b4e0688a6d89 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.061s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:15 compute-0 sshd-session[213590]: Invalid user test from 185.156.73.233 port 35740
Sep 30 07:17:15 compute-0 podman[213593]: 2025-09-30 07:17:15.727194557 +0000 UTC m=+0.071065861 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_id=edpm, version=9.6, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 07:17:15 compute-0 sshd-session[213590]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:17:15 compute-0 sshd-session[213590]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233
Sep 30 07:17:16 compute-0 sshd-session[213558]: Failed password for root from 193.46.255.33 port 31732 ssh2
Sep 30 07:17:16 compute-0 nova_compute[189265]: 2025-09-30 07:17:16.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:16 compute-0 unix_chkpwd[213614]: password check failed for user (root)
Sep 30 07:17:17 compute-0 sshd-session[213590]: Failed password for invalid user test from 185.156.73.233 port 35740 ssh2
Sep 30 07:17:18 compute-0 sshd-session[213558]: Failed password for root from 193.46.255.33 port 31732 ssh2
Sep 30 07:17:18 compute-0 podman[213615]: 2025-09-30 07:17:18.497592511 +0000 UTC m=+0.073881182 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 07:17:18 compute-0 podman[213616]: 2025-09-30 07:17:18.568097134 +0000 UTC m=+0.142410368 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2)
Sep 30 07:17:18 compute-0 nova_compute[189265]: 2025-09-30 07:17:18.835 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:19 compute-0 sshd-session[213590]: Connection closed by invalid user test 185.156.73.233 port 35740 [preauth]
Sep 30 07:17:19 compute-0 sshd-session[213558]: Received disconnect from 193.46.255.33 port 31732:11:  [preauth]
Sep 30 07:17:19 compute-0 sshd-session[213558]: Disconnected from authenticating user root 193.46.255.33 port 31732 [preauth]
Sep 30 07:17:19 compute-0 sshd-session[213558]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Sep 30 07:17:19 compute-0 unix_chkpwd[213664]: password check failed for user (root)
Sep 30 07:17:19 compute-0 sshd-session[213662]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Sep 30 07:17:20 compute-0 podman[213665]: 2025-09-30 07:17:20.482518517 +0000 UTC m=+0.061265358 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 07:17:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:20.541 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:20.541 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:20.542 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:21 compute-0 nova_compute[189265]: 2025-09-30 07:17:21.822 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:22 compute-0 sshd-session[213662]: Failed password for root from 193.46.255.33 port 61076 ssh2
Sep 30 07:17:23 compute-0 nova_compute[189265]: 2025-09-30 07:17:23.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:24 compute-0 unix_chkpwd[213704]: password check failed for user (root)
Sep 30 07:17:24 compute-0 ovn_controller[91436]: 2025-09-30T07:17:24Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ec:ab:8a 10.100.0.7
Sep 30 07:17:24 compute-0 ovn_controller[91436]: 2025-09-30T07:17:24Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ec:ab:8a 10.100.0.7
Sep 30 07:17:25 compute-0 sshd-session[213662]: Failed password for root from 193.46.255.33 port 61076 ssh2
Sep 30 07:17:26 compute-0 unix_chkpwd[213705]: password check failed for user (root)
Sep 30 07:17:26 compute-0 nova_compute[189265]: 2025-09-30 07:17:26.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:27 compute-0 nova_compute[189265]: 2025-09-30 07:17:27.795 2 DEBUG nova.compute.manager [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Stashing vm_state: active _prep_resize /usr/lib/python3.12/site-packages/nova/compute/manager.py:6169
Sep 30 07:17:28 compute-0 sshd-session[213662]: Failed password for root from 193.46.255.33 port 61076 ssh2
Sep 30 07:17:28 compute-0 nova_compute[189265]: 2025-09-30 07:17:28.364 2 DEBUG oslo_concurrency.lockutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:28 compute-0 nova_compute[189265]: 2025-09-30 07:17:28.365 2 DEBUG oslo_concurrency.lockutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:28 compute-0 nova_compute[189265]: 2025-09-30 07:17:28.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:28 compute-0 nova_compute[189265]: 2025-09-30 07:17:28.880 2 DEBUG nova.objects.instance [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'pci_requests' on Instance uuid 3ad1a338-1146-48fa-a1fb-579e9b577b6c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:17:29 compute-0 nova_compute[189265]: 2025-09-30 07:17:29.055 2 DEBUG nova.virt.libvirt.driver [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Creating tmpfile /var/lib/nova/instances/tmpsgrlhdd3 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 07:17:29 compute-0 nova_compute[189265]: 2025-09-30 07:17:29.057 2 WARNING neutronclient.v2_0.client [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:29 compute-0 nova_compute[189265]: 2025-09-30 07:17:29.149 2 DEBUG nova.compute.manager [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsgrlhdd3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 07:17:29 compute-0 nova_compute[189265]: 2025-09-30 07:17:29.174 2 DEBUG oslo_concurrency.lockutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:17:29 compute-0 nova_compute[189265]: 2025-09-30 07:17:29.174 2 DEBUG oslo_concurrency.lockutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:17:29 compute-0 nova_compute[189265]: 2025-09-30 07:17:29.393 2 DEBUG nova.virt.hardware [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 07:17:29 compute-0 nova_compute[189265]: 2025-09-30 07:17:29.393 2 INFO nova.compute.claims [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Claim successful on node compute-0.ctlplane.example.com
Sep 30 07:17:29 compute-0 nova_compute[189265]: 2025-09-30 07:17:29.394 2 DEBUG nova.objects.instance [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'resources' on Instance uuid 3ad1a338-1146-48fa-a1fb-579e9b577b6c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:17:29 compute-0 nova_compute[189265]: 2025-09-30 07:17:29.687 2 INFO nova.compute.rpcapi [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Automatically selected compute RPC version 6.4 from minimum service version 70
Sep 30 07:17:29 compute-0 nova_compute[189265]: 2025-09-30 07:17:29.688 2 DEBUG oslo_concurrency.lockutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:17:29 compute-0 podman[199733]: time="2025-09-30T07:17:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:17:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:17:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:17:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:17:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3456 "" "Go-http-client/1.1"
Sep 30 07:17:29 compute-0 nova_compute[189265]: 2025-09-30 07:17:29.901 2 DEBUG nova.objects.base [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<3ad1a338-1146-48fa-a1fb-579e9b577b6c> lazy-loaded attributes: pci_requests,resources wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:17:29 compute-0 nova_compute[189265]: 2025-09-30 07:17:29.901 2 DEBUG nova.objects.instance [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'numa_topology' on Instance uuid 3ad1a338-1146-48fa-a1fb-579e9b577b6c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:17:30 compute-0 sshd-session[213662]: Received disconnect from 193.46.255.33 port 61076:11:  [preauth]
Sep 30 07:17:30 compute-0 sshd-session[213662]: Disconnected from authenticating user root 193.46.255.33 port 61076 [preauth]
Sep 30 07:17:30 compute-0 sshd-session[213662]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Sep 30 07:17:30 compute-0 nova_compute[189265]: 2025-09-30 07:17:30.421 2 DEBUG nova.objects.base [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<3ad1a338-1146-48fa-a1fb-579e9b577b6c> lazy-loaded attributes: pci_requests,resources,numa_topology wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:17:30 compute-0 nova_compute[189265]: 2025-09-30 07:17:30.421 2 DEBUG nova.objects.instance [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'pci_devices' on Instance uuid 3ad1a338-1146-48fa-a1fb-579e9b577b6c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:17:30 compute-0 nova_compute[189265]: 2025-09-30 07:17:30.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:17:30 compute-0 nova_compute[189265]: 2025-09-30 07:17:30.928 2 DEBUG nova.objects.base [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<3ad1a338-1146-48fa-a1fb-579e9b577b6c> lazy-loaded attributes: pci_requests,resources,numa_topology,pci_devices wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:17:31 compute-0 unix_chkpwd[213709]: password check failed for user (root)
Sep 30 07:17:31 compute-0 sshd-session[213707]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Sep 30 07:17:31 compute-0 openstack_network_exporter[201859]: ERROR   07:17:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:17:31 compute-0 openstack_network_exporter[201859]: ERROR   07:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:17:31 compute-0 openstack_network_exporter[201859]: ERROR   07:17:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:17:31 compute-0 openstack_network_exporter[201859]: ERROR   07:17:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:17:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:17:31 compute-0 openstack_network_exporter[201859]: ERROR   07:17:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:17:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:17:31 compute-0 nova_compute[189265]: 2025-09-30 07:17:31.466 2 INFO nova.compute.resource_tracker [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Updating resource usage from migration 27698052-06fc-4032-8e09-3a1784ec1e1c
Sep 30 07:17:31 compute-0 nova_compute[189265]: 2025-09-30 07:17:31.467 2 DEBUG nova.compute.resource_tracker [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Starting to track incoming migration 27698052-06fc-4032-8e09-3a1784ec1e1c with flavor ded17455-f8fe-40c7-8dae-6f0a2b208ae0 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Sep 30 07:17:31 compute-0 nova_compute[189265]: 2025-09-30 07:17:31.728 2 WARNING neutronclient.v2_0.client [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:31 compute-0 nova_compute[189265]: 2025-09-30 07:17:31.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:32 compute-0 nova_compute[189265]: 2025-09-30 07:17:32.224 2 DEBUG nova.compute.provider_tree [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:17:32 compute-0 nova_compute[189265]: 2025-09-30 07:17:32.733 2 DEBUG nova.scheduler.client.report [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:17:32 compute-0 nova_compute[189265]: 2025-09-30 07:17:32.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:17:33 compute-0 nova_compute[189265]: 2025-09-30 07:17:33.250 2 DEBUG oslo_concurrency.lockutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 4.885s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:33 compute-0 nova_compute[189265]: 2025-09-30 07:17:33.250 2 INFO nova.compute.manager [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Migrating
Sep 30 07:17:33 compute-0 sshd-session[213707]: Failed password for root from 193.46.255.33 port 21952 ssh2
Sep 30 07:17:33 compute-0 podman[213710]: 2025-09-30 07:17:33.492704414 +0000 UTC m=+0.077267672 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:17:33 compute-0 nova_compute[189265]: 2025-09-30 07:17:33.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:17:33 compute-0 nova_compute[189265]: 2025-09-30 07:17:33.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:34 compute-0 nova_compute[189265]: 2025-09-30 07:17:34.784 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:17:35 compute-0 nova_compute[189265]: 2025-09-30 07:17:35.313 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:17:35 compute-0 unix_chkpwd[213735]: password check failed for user (root)
Sep 30 07:17:35 compute-0 nova_compute[189265]: 2025-09-30 07:17:35.654 2 DEBUG nova.compute.manager [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsgrlhdd3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a62dd947-c757-461c-9dd7-2ccd8c8daf8c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 07:17:35 compute-0 nova_compute[189265]: 2025-09-30 07:17:35.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:17:35 compute-0 nova_compute[189265]: 2025-09-30 07:17:35.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:17:35 compute-0 nova_compute[189265]: 2025-09-30 07:17:35.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:17:35 compute-0 nova_compute[189265]: 2025-09-30 07:17:35.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:17:36 compute-0 nova_compute[189265]: 2025-09-30 07:17:36.364 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:36 compute-0 nova_compute[189265]: 2025-09-30 07:17:36.365 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:36 compute-0 nova_compute[189265]: 2025-09-30 07:17:36.365 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:36 compute-0 nova_compute[189265]: 2025-09-30 07:17:36.365 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:17:36 compute-0 nova_compute[189265]: 2025-09-30 07:17:36.794 2 DEBUG oslo_concurrency.lockutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-a62dd947-c757-461c-9dd7-2ccd8c8daf8c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:17:36 compute-0 nova_compute[189265]: 2025-09-30 07:17:36.794 2 DEBUG oslo_concurrency.lockutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-a62dd947-c757-461c-9dd7-2ccd8c8daf8c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:17:36 compute-0 nova_compute[189265]: 2025-09-30 07:17:36.794 2 DEBUG nova.network.neutron [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:17:36 compute-0 nova_compute[189265]: 2025-09-30 07:17:36.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:37 compute-0 sshd-session[213707]: Failed password for root from 193.46.255.33 port 21952 ssh2
Sep 30 07:17:37 compute-0 nova_compute[189265]: 2025-09-30 07:17:37.413 2 WARNING neutronclient.v2_0.client [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:37 compute-0 unix_chkpwd[213736]: password check failed for user (root)
Sep 30 07:17:37 compute-0 nova_compute[189265]: 2025-09-30 07:17:37.584 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:37 compute-0 nova_compute[189265]: 2025-09-30 07:17:37.652 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:37 compute-0 nova_compute[189265]: 2025-09-30 07:17:37.653 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:37 compute-0 nova_compute[189265]: 2025-09-30 07:17:37.736 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:37 compute-0 nova_compute[189265]: 2025-09-30 07:17:37.742 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:37 compute-0 nova_compute[189265]: 2025-09-30 07:17:37.791 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:37 compute-0 nova_compute[189265]: 2025-09-30 07:17:37.792 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:37 compute-0 nova_compute[189265]: 2025-09-30 07:17:37.812 2 WARNING neutronclient.v2_0.client [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:37 compute-0 nova_compute[189265]: 2025-09-30 07:17:37.845 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:37 compute-0 nova_compute[189265]: 2025-09-30 07:17:37.856 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:37 compute-0 nova_compute[189265]: 2025-09-30 07:17:37.925 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:37 compute-0 nova_compute[189265]: 2025-09-30 07:17:37.926 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:37 compute-0 sshd-session[213749]: Accepted publickey for nova from 192.168.122.101 port 43118 ssh2: ECDSA SHA256:kKBDNgxy0w2UAT9K/oU+qJBoJqb+wdTtNjxg0tFZ484
Sep 30 07:17:37 compute-0 systemd[1]: Created slice User Slice of UID 42436.
Sep 30 07:17:37 compute-0 systemd[1]: Starting User Runtime Directory /run/user/42436...
Sep 30 07:17:37 compute-0 systemd-logind[824]: New session 29 of user nova.
Sep 30 07:17:38 compute-0 systemd[1]: Finished User Runtime Directory /run/user/42436.
Sep 30 07:17:38 compute-0 nova_compute[189265]: 2025-09-30 07:17:38.011 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:38 compute-0 systemd[1]: Starting User Manager for UID 42436...
Sep 30 07:17:38 compute-0 nova_compute[189265]: 2025-09-30 07:17:38.025 2 DEBUG nova.network.neutron [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Updating instance_info_cache with network_info: [{"id": "50e9f0fc-d5c3-4230-aea5-ef47736ac58f", "address": "fa:16:3e:d0:55:ec", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50e9f0fc-d5", "ovs_interfaceid": "50e9f0fc-d5c3-4230-aea5-ef47736ac58f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:17:38 compute-0 systemd[213757]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 07:17:38 compute-0 systemd[213757]: Queued start job for default target Main User Target.
Sep 30 07:17:38 compute-0 nova_compute[189265]: 2025-09-30 07:17:38.165 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:17:38 compute-0 nova_compute[189265]: 2025-09-30 07:17:38.166 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:38 compute-0 nova_compute[189265]: 2025-09-30 07:17:38.181 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:38 compute-0 nova_compute[189265]: 2025-09-30 07:17:38.182 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5401MB free_disk=73.22203826904297GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:17:38 compute-0 nova_compute[189265]: 2025-09-30 07:17:38.182 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:38 compute-0 nova_compute[189265]: 2025-09-30 07:17:38.183 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:38 compute-0 systemd[213757]: Created slice User Application Slice.
Sep 30 07:17:38 compute-0 systemd[213757]: Started Mark boot as successful after the user session has run 2 minutes.
Sep 30 07:17:38 compute-0 systemd[213757]: Started Daily Cleanup of User's Temporary Directories.
Sep 30 07:17:38 compute-0 systemd[213757]: Reached target Paths.
Sep 30 07:17:38 compute-0 systemd[213757]: Reached target Timers.
Sep 30 07:17:38 compute-0 systemd[213757]: Starting D-Bus User Message Bus Socket...
Sep 30 07:17:38 compute-0 systemd[213757]: Starting Create User's Volatile Files and Directories...
Sep 30 07:17:38 compute-0 systemd[213757]: Finished Create User's Volatile Files and Directories.
Sep 30 07:17:38 compute-0 systemd[213757]: Listening on D-Bus User Message Bus Socket.
Sep 30 07:17:38 compute-0 systemd[213757]: Reached target Sockets.
Sep 30 07:17:38 compute-0 systemd[213757]: Reached target Basic System.
Sep 30 07:17:38 compute-0 systemd[213757]: Reached target Main User Target.
Sep 30 07:17:38 compute-0 systemd[213757]: Startup finished in 152ms.
Sep 30 07:17:38 compute-0 systemd[1]: Started User Manager for UID 42436.
Sep 30 07:17:38 compute-0 systemd[1]: Started Session 29 of User nova.
Sep 30 07:17:38 compute-0 sshd-session[213749]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 07:17:38 compute-0 sshd-session[213775]: Received disconnect from 192.168.122.101 port 43118:11: disconnected by user
Sep 30 07:17:38 compute-0 sshd-session[213775]: Disconnected from user nova 192.168.122.101 port 43118
Sep 30 07:17:38 compute-0 sshd-session[213749]: pam_unix(sshd:session): session closed for user nova
Sep 30 07:17:38 compute-0 systemd[1]: session-29.scope: Deactivated successfully.
Sep 30 07:17:38 compute-0 systemd-logind[824]: Session 29 logged out. Waiting for processes to exit.
Sep 30 07:17:38 compute-0 systemd-logind[824]: Removed session 29.
Sep 30 07:17:38 compute-0 sshd-session[213777]: Accepted publickey for nova from 192.168.122.101 port 43134 ssh2: ECDSA SHA256:kKBDNgxy0w2UAT9K/oU+qJBoJqb+wdTtNjxg0tFZ484
Sep 30 07:17:38 compute-0 systemd-logind[824]: New session 31 of user nova.
Sep 30 07:17:38 compute-0 systemd[1]: Started Session 31 of User nova.
Sep 30 07:17:38 compute-0 sshd-session[213777]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 07:17:38 compute-0 sshd-session[213780]: Received disconnect from 192.168.122.101 port 43134:11: disconnected by user
Sep 30 07:17:38 compute-0 sshd-session[213780]: Disconnected from user nova 192.168.122.101 port 43134
Sep 30 07:17:38 compute-0 sshd-session[213777]: pam_unix(sshd:session): session closed for user nova
Sep 30 07:17:38 compute-0 systemd[1]: session-31.scope: Deactivated successfully.
Sep 30 07:17:38 compute-0 systemd-logind[824]: Session 31 logged out. Waiting for processes to exit.
Sep 30 07:17:38 compute-0 systemd-logind[824]: Removed session 31.
Sep 30 07:17:38 compute-0 nova_compute[189265]: 2025-09-30 07:17:38.666 2 DEBUG oslo_concurrency.lockutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-a62dd947-c757-461c-9dd7-2ccd8c8daf8c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:17:38 compute-0 nova_compute[189265]: 2025-09-30 07:17:38.721 2 DEBUG nova.virt.libvirt.driver [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsgrlhdd3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a62dd947-c757-461c-9dd7-2ccd8c8daf8c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 07:17:38 compute-0 nova_compute[189265]: 2025-09-30 07:17:38.721 2 DEBUG nova.virt.libvirt.driver [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Creating instance directory: /var/lib/nova/instances/a62dd947-c757-461c-9dd7-2ccd8c8daf8c pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 07:17:38 compute-0 nova_compute[189265]: 2025-09-30 07:17:38.722 2 DEBUG nova.virt.libvirt.driver [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Creating disk.info with the contents: {'/var/lib/nova/instances/a62dd947-c757-461c-9dd7-2ccd8c8daf8c/disk': 'qcow2', '/var/lib/nova/instances/a62dd947-c757-461c-9dd7-2ccd8c8daf8c/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 07:17:38 compute-0 nova_compute[189265]: 2025-09-30 07:17:38.722 2 DEBUG nova.virt.libvirt.driver [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 07:17:38 compute-0 nova_compute[189265]: 2025-09-30 07:17:38.723 2 DEBUG nova.objects.instance [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid a62dd947-c757-461c-9dd7-2ccd8c8daf8c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:17:38 compute-0 nova_compute[189265]: 2025-09-30 07:17:38.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.250 2 DEBUG oslo_utils.imageutils.format_inspector [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.256 2 DEBUG oslo_utils.imageutils.format_inspector [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.258 2 DEBUG oslo_concurrency.processutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.314 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Migration for instance a62dd947-c757-461c-9dd7-2ccd8c8daf8c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.315 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Migration for instance 3ad1a338-1146-48fa-a1fb-579e9b577b6c refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.351 2 DEBUG oslo_concurrency.processutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.352 2 DEBUG oslo_concurrency.lockutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.353 2 DEBUG oslo_concurrency.lockutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.354 2 DEBUG oslo_utils.imageutils.format_inspector [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.360 2 DEBUG oslo_utils.imageutils.format_inspector [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.361 2 DEBUG oslo_concurrency.processutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.415 2 DEBUG oslo_concurrency.processutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.415 2 DEBUG oslo_concurrency.processutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/a62dd947-c757-461c-9dd7-2ccd8c8daf8c/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.449 2 DEBUG oslo_concurrency.processutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/a62dd947-c757-461c-9dd7-2ccd8c8daf8c/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.450 2 DEBUG oslo_concurrency.lockutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.097s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.451 2 DEBUG oslo_concurrency.processutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.535 2 DEBUG oslo_concurrency.processutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.538 2 DEBUG nova.virt.disk.api [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Checking if we can resize image /var/lib/nova/instances/a62dd947-c757-461c-9dd7-2ccd8c8daf8c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.539 2 DEBUG oslo_concurrency.processutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a62dd947-c757-461c-9dd7-2ccd8c8daf8c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.590 2 DEBUG oslo_concurrency.processutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a62dd947-c757-461c-9dd7-2ccd8c8daf8c/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.590 2 DEBUG nova.virt.disk.api [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Cannot resize image /var/lib/nova/instances/a62dd947-c757-461c-9dd7-2ccd8c8daf8c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:17:39 compute-0 nova_compute[189265]: 2025-09-30 07:17:39.591 2 DEBUG nova.objects.instance [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'migration_context' on Instance uuid a62dd947-c757-461c-9dd7-2ccd8c8daf8c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:17:39 compute-0 sshd-session[213707]: Failed password for root from 193.46.255.33 port 21952 ssh2
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.101 2 DEBUG nova.objects.base [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<a62dd947-c757-461c-9dd7-2ccd8c8daf8c> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.102 2 DEBUG oslo_concurrency.processutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/a62dd947-c757-461c-9dd7-2ccd8c8daf8c/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.135 2 DEBUG oslo_concurrency.processutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/a62dd947-c757-461c-9dd7-2ccd8c8daf8c/disk.config 497664" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.136 2 DEBUG nova.virt.libvirt.driver [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.137 2 DEBUG nova.virt.libvirt.vif [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T07:15:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-653610675',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-653610675',id=4,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:16:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1413b21c2db845e58d8a81f524a55f3a',ramdisk_id='',reservation_id='r-rahezju2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-2061885601',owner_user_name='tempest-TestExecuteActionsViaActuator-2061885601-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:16:01Z,user_data=None,user_id='d6cb6be5d6fc407eb3abc1c7c70f5d77',uuid=a62dd947-c757-461c-9dd7-2ccd8c8daf8c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50e9f0fc-d5c3-4230-aea5-ef47736ac58f", "address": "fa:16:3e:d0:55:ec", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap50e9f0fc-d5", "ovs_interfaceid": "50e9f0fc-d5c3-4230-aea5-ef47736ac58f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.137 2 DEBUG nova.network.os_vif_util [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "50e9f0fc-d5c3-4230-aea5-ef47736ac58f", "address": "fa:16:3e:d0:55:ec", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap50e9f0fc-d5", "ovs_interfaceid": "50e9f0fc-d5c3-4230-aea5-ef47736ac58f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.138 2 DEBUG nova.network.os_vif_util [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d0:55:ec,bridge_name='br-int',has_traffic_filtering=True,id=50e9f0fc-d5c3-4230-aea5-ef47736ac58f,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50e9f0fc-d5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.138 2 DEBUG os_vif [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:55:ec,bridge_name='br-int',has_traffic_filtering=True,id=50e9f0fc-d5c3-4230-aea5-ef47736ac58f,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50e9f0fc-d5') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.139 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.140 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.141 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '78caddd4-4931-5316-86df-38d77821f04c', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.142 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.147 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50e9f0fc-d5, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.148 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap50e9f0fc-d5, col_values=(('qos', UUID('879cef01-d914-4ea3-8bb8-74fe164e6086')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.148 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap50e9f0fc-d5, col_values=(('external_ids', {'iface-id': '50e9f0fc-d5c3-4230-aea5-ef47736ac58f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d0:55:ec', 'vm-uuid': 'a62dd947-c757-461c-9dd7-2ccd8c8daf8c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:40 compute-0 NetworkManager[51813]: <info>  [1759216660.1502] manager: (tap50e9f0fc-d5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.151 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.156 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.157 2 INFO os_vif [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d0:55:ec,bridge_name='br-int',has_traffic_filtering=True,id=50e9f0fc-d5c3-4230-aea5-ef47736ac58f,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50e9f0fc-d5')
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.158 2 DEBUG nova.virt.libvirt.driver [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.158 2 DEBUG nova.compute.manager [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsgrlhdd3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a62dd947-c757-461c-9dd7-2ccd8c8daf8c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.159 2 WARNING neutronclient.v2_0.client [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.246 2 WARNING neutronclient.v2_0.client [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.354 2 INFO nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Updating resource usage from migration 79af3a1d-3fed-4083-a4f4-dcb6c21a3134
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.355 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Starting to track incoming migration 79af3a1d-3fed-4083-a4f4-dcb6c21a3134 with flavor ded17455-f8fe-40c7-8dae-6f0a2b208ae0 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Sep 30 07:17:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:40.599 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:17:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:40.599 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.890 2 INFO nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Updating resource usage from migration 27698052-06fc-4032-8e09-3a1784ec1e1c
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.890 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Starting to track incoming migration 27698052-06fc-4032-8e09-3a1784ec1e1c with flavor ded17455-f8fe-40c7-8dae-6f0a2b208ae0 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.930 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance 9fa193fb-a398-4552-85b4-a346dffcf697 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.930 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance d40a0fba-a20e-4dcf-a048-10d9e21c6cf6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:17:40 compute-0 nova_compute[189265]: 2025-09-30 07:17:40.930 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance 992a0681-bc5e-40b3-adf3-305eee0718fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:17:41 compute-0 nova_compute[189265]: 2025-09-30 07:17:41.437 2 WARNING nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance a62dd947-c757-461c-9dd7-2ccd8c8daf8c has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 07:17:41 compute-0 sshd-session[213707]: Received disconnect from 193.46.255.33 port 21952:11:  [preauth]
Sep 30 07:17:41 compute-0 sshd-session[213707]: Disconnected from authenticating user root 193.46.255.33 port 21952 [preauth]
Sep 30 07:17:41 compute-0 sshd-session[213707]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.33  user=root
Sep 30 07:17:41 compute-0 sshd-session[213803]: Accepted publickey for nova from 192.168.122.101 port 43140 ssh2: ECDSA SHA256:kKBDNgxy0w2UAT9K/oU+qJBoJqb+wdTtNjxg0tFZ484
Sep 30 07:17:41 compute-0 systemd-logind[824]: New session 32 of user nova.
Sep 30 07:17:41 compute-0 systemd[1]: Started Session 32 of User nova.
Sep 30 07:17:41 compute-0 sshd-session[213803]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 07:17:41 compute-0 nova_compute[189265]: 2025-09-30 07:17:41.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:41 compute-0 nova_compute[189265]: 2025-09-30 07:17:41.963 2 WARNING nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance 3ad1a338-1146-48fa-a1fb-579e9b577b6c has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 07:17:41 compute-0 nova_compute[189265]: 2025-09-30 07:17:41.963 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 5 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:17:41 compute-0 nova_compute[189265]: 2025-09-30 07:17:41.964 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=1152MB phys_disk=79GB used_disk=5GB total_vcpus=8 used_vcpus=5 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:17:38 up  1:15,  0 user,  load average: 0.54, 0.38, 0.46\n', 'num_instances': '3', 'num_vm_active': '3', 'num_task_None': '3', 'num_os_type_None': '3', 'num_proj_1413b21c2db845e58d8a81f524a55f3a': '3', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:17:42 compute-0 nova_compute[189265]: 2025-09-30 07:17:42.048 2 DEBUG nova.network.neutron [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Port 50e9f0fc-d5c3-4230-aea5-ef47736ac58f updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 07:17:42 compute-0 nova_compute[189265]: 2025-09-30 07:17:42.079 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:17:42 compute-0 nova_compute[189265]: 2025-09-30 07:17:42.141 2 DEBUG nova.compute.manager [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=72704,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsgrlhdd3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='a62dd947-c757-461c-9dd7-2ccd8c8daf8c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 07:17:42 compute-0 sshd-session[213806]: Received disconnect from 192.168.122.101 port 43140:11: disconnected by user
Sep 30 07:17:42 compute-0 sshd-session[213806]: Disconnected from user nova 192.168.122.101 port 43140
Sep 30 07:17:42 compute-0 sshd-session[213803]: pam_unix(sshd:session): session closed for user nova
Sep 30 07:17:42 compute-0 systemd[1]: session-32.scope: Deactivated successfully.
Sep 30 07:17:42 compute-0 systemd-logind[824]: Session 32 logged out. Waiting for processes to exit.
Sep 30 07:17:42 compute-0 systemd-logind[824]: Removed session 32.
Sep 30 07:17:42 compute-0 sshd-session[213808]: Accepted publickey for nova from 192.168.122.101 port 43154 ssh2: ECDSA SHA256:kKBDNgxy0w2UAT9K/oU+qJBoJqb+wdTtNjxg0tFZ484
Sep 30 07:17:42 compute-0 systemd-logind[824]: New session 33 of user nova.
Sep 30 07:17:42 compute-0 systemd[1]: Started Session 33 of User nova.
Sep 30 07:17:42 compute-0 sshd-session[213808]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 07:17:42 compute-0 sshd-session[213811]: Received disconnect from 192.168.122.101 port 43154:11: disconnected by user
Sep 30 07:17:42 compute-0 sshd-session[213811]: Disconnected from user nova 192.168.122.101 port 43154
Sep 30 07:17:42 compute-0 sshd-session[213808]: pam_unix(sshd:session): session closed for user nova
Sep 30 07:17:42 compute-0 systemd[1]: session-33.scope: Deactivated successfully.
Sep 30 07:17:42 compute-0 systemd-logind[824]: Session 33 logged out. Waiting for processes to exit.
Sep 30 07:17:42 compute-0 systemd-logind[824]: Removed session 33.
Sep 30 07:17:42 compute-0 sshd-session[213813]: Accepted publickey for nova from 192.168.122.101 port 43164 ssh2: ECDSA SHA256:kKBDNgxy0w2UAT9K/oU+qJBoJqb+wdTtNjxg0tFZ484
Sep 30 07:17:42 compute-0 systemd-logind[824]: New session 34 of user nova.
Sep 30 07:17:42 compute-0 nova_compute[189265]: 2025-09-30 07:17:42.647 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:17:42 compute-0 systemd[1]: Started Session 34 of User nova.
Sep 30 07:17:42 compute-0 sshd-session[213813]: pam_unix(sshd:session): session opened for user nova(uid=42436) by nova(uid=0)
Sep 30 07:17:42 compute-0 podman[213815]: 2025-09-30 07:17:42.718623943 +0000 UTC m=+0.084439012 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.build-date=20250930)
Sep 30 07:17:42 compute-0 sshd-session[213826]: Received disconnect from 192.168.122.101 port 43164:11: disconnected by user
Sep 30 07:17:42 compute-0 sshd-session[213826]: Disconnected from user nova 192.168.122.101 port 43164
Sep 30 07:17:42 compute-0 sshd-session[213813]: pam_unix(sshd:session): session closed for user nova
Sep 30 07:17:42 compute-0 nova_compute[189265]: 2025-09-30 07:17:42.725 2 DEBUG nova.compute.manager [req-e5303dae-056a-4106-8fc1-94011cb264eb req-afe5e5ed-cbe4-41e1-b26d-e36a6da07298 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Received event network-vif-unplugged-d4e03551-f8cd-4604-990f-8c855bea77fa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:17:42 compute-0 nova_compute[189265]: 2025-09-30 07:17:42.725 2 DEBUG oslo_concurrency.lockutils [req-e5303dae-056a-4106-8fc1-94011cb264eb req-afe5e5ed-cbe4-41e1-b26d-e36a6da07298 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:42 compute-0 nova_compute[189265]: 2025-09-30 07:17:42.726 2 DEBUG oslo_concurrency.lockutils [req-e5303dae-056a-4106-8fc1-94011cb264eb req-afe5e5ed-cbe4-41e1-b26d-e36a6da07298 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:42 compute-0 nova_compute[189265]: 2025-09-30 07:17:42.726 2 DEBUG oslo_concurrency.lockutils [req-e5303dae-056a-4106-8fc1-94011cb264eb req-afe5e5ed-cbe4-41e1-b26d-e36a6da07298 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:42 compute-0 systemd[1]: session-34.scope: Deactivated successfully.
Sep 30 07:17:42 compute-0 nova_compute[189265]: 2025-09-30 07:17:42.726 2 DEBUG nova.compute.manager [req-e5303dae-056a-4106-8fc1-94011cb264eb req-afe5e5ed-cbe4-41e1-b26d-e36a6da07298 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] No waiting events found dispatching network-vif-unplugged-d4e03551-f8cd-4604-990f-8c855bea77fa pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:17:42 compute-0 nova_compute[189265]: 2025-09-30 07:17:42.728 2 WARNING nova.compute.manager [req-e5303dae-056a-4106-8fc1-94011cb264eb req-afe5e5ed-cbe4-41e1-b26d-e36a6da07298 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Received unexpected event network-vif-unplugged-d4e03551-f8cd-4604-990f-8c855bea77fa for instance with vm_state active and task_state resize_migrating.
Sep 30 07:17:42 compute-0 nova_compute[189265]: 2025-09-30 07:17:42.728 2 DEBUG nova.compute.manager [req-e5303dae-056a-4106-8fc1-94011cb264eb req-afe5e5ed-cbe4-41e1-b26d-e36a6da07298 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Received event network-vif-unplugged-d4e03551-f8cd-4604-990f-8c855bea77fa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:17:42 compute-0 nova_compute[189265]: 2025-09-30 07:17:42.729 2 DEBUG oslo_concurrency.lockutils [req-e5303dae-056a-4106-8fc1-94011cb264eb req-afe5e5ed-cbe4-41e1-b26d-e36a6da07298 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:42 compute-0 nova_compute[189265]: 2025-09-30 07:17:42.729 2 DEBUG oslo_concurrency.lockutils [req-e5303dae-056a-4106-8fc1-94011cb264eb req-afe5e5ed-cbe4-41e1-b26d-e36a6da07298 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:42 compute-0 nova_compute[189265]: 2025-09-30 07:17:42.729 2 DEBUG oslo_concurrency.lockutils [req-e5303dae-056a-4106-8fc1-94011cb264eb req-afe5e5ed-cbe4-41e1-b26d-e36a6da07298 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:42 compute-0 systemd-logind[824]: Session 34 logged out. Waiting for processes to exit.
Sep 30 07:17:42 compute-0 nova_compute[189265]: 2025-09-30 07:17:42.729 2 DEBUG nova.compute.manager [req-e5303dae-056a-4106-8fc1-94011cb264eb req-afe5e5ed-cbe4-41e1-b26d-e36a6da07298 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] No waiting events found dispatching network-vif-unplugged-d4e03551-f8cd-4604-990f-8c855bea77fa pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:17:42 compute-0 nova_compute[189265]: 2025-09-30 07:17:42.729 2 WARNING nova.compute.manager [req-e5303dae-056a-4106-8fc1-94011cb264eb req-afe5e5ed-cbe4-41e1-b26d-e36a6da07298 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Received unexpected event network-vif-unplugged-d4e03551-f8cd-4604-990f-8c855bea77fa for instance with vm_state active and task_state resize_migrating.
Sep 30 07:17:42 compute-0 systemd-logind[824]: Removed session 34.
Sep 30 07:17:43 compute-0 nova_compute[189265]: 2025-09-30 07:17:43.180 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:17:43 compute-0 nova_compute[189265]: 2025-09-30 07:17:43.180 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.998s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:44 compute-0 systemd[1]: Starting libvirt proxy daemon...
Sep 30 07:17:44 compute-0 systemd[1]: Started libvirt proxy daemon.
Sep 30 07:17:44 compute-0 kernel: tap50e9f0fc-d5: entered promiscuous mode
Sep 30 07:17:44 compute-0 NetworkManager[51813]: <info>  [1759216664.9452] manager: (tap50e9f0fc-d5): new Tun device (/org/freedesktop/NetworkManager/Devices/31)
Sep 30 07:17:44 compute-0 nova_compute[189265]: 2025-09-30 07:17:44.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:44 compute-0 ovn_controller[91436]: 2025-09-30T07:17:44Z|00054|binding|INFO|Claiming lport 50e9f0fc-d5c3-4230-aea5-ef47736ac58f for this additional chassis.
Sep 30 07:17:44 compute-0 ovn_controller[91436]: 2025-09-30T07:17:44Z|00055|binding|INFO|50e9f0fc-d5c3-4230-aea5-ef47736ac58f: Claiming fa:16:3e:d0:55:ec 10.100.0.13
Sep 30 07:17:44 compute-0 nova_compute[189265]: 2025-09-30 07:17:44.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:44 compute-0 ovn_controller[91436]: 2025-09-30T07:17:44Z|00056|binding|INFO|Setting lport 50e9f0fc-d5c3-4230-aea5-ef47736ac58f ovn-installed in OVS
Sep 30 07:17:44 compute-0 nova_compute[189265]: 2025-09-30 07:17:44.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:44 compute-0 nova_compute[189265]: 2025-09-30 07:17:44.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:44.969 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:55:ec 10.100.0.13'], port_security=['fa:16:3e:d0:55:ec 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a62dd947-c757-461c-9dd7-2ccd8c8daf8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1413b21c2db845e58d8a81f524a55f3a', 'neutron:revision_number': '10', 'neutron:security_group_ids': '8ad3c6f6-3842-4d69-92ac-cef07b75c3bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b541691-433c-426c-b8b7-10d79319603a, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[], logical_port=50e9f0fc-d5c3-4230-aea5-ef47736ac58f) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:17:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:44.972 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 50e9f0fc-d5c3-4230-aea5-ef47736ac58f in datapath 74ffbf65-ebbd-4587-bf5b-0b38421a4813 unbound from our chassis
Sep 30 07:17:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:44.973 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74ffbf65-ebbd-4587-bf5b-0b38421a4813
Sep 30 07:17:44 compute-0 systemd-udevd[213868]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:17:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:44.991 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0156bae6-c8e2-43bc-aa82-29fffd2345b3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:45 compute-0 systemd-machined[149233]: New machine qemu-4-instance-00000004.
Sep 30 07:17:45 compute-0 NetworkManager[51813]: <info>  [1759216665.0035] device (tap50e9f0fc-d5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:17:45 compute-0 NetworkManager[51813]: <info>  [1759216665.0042] device (tap50e9f0fc-d5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:17:45 compute-0 systemd[1]: Started Virtual Machine qemu-4-instance-00000004.
Sep 30 07:17:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:45.036 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[27b2b809-818c-4dfb-b7ae-4b06a6b96c42]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:45.039 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[f6eb858e-99f3-4222-894d-bf1df11b60b9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:45.080 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[fb2306bd-f9bd-42b8-91ee-369f53d074ed]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:45.104 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[35ea5de4-e969-4240-a345-c447faf9ba66]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74ffbf65-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:ef:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 14, 'tx_packets': 9, 'rx_bytes': 1084, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 14, 'tx_packets': 9, 'rx_bytes': 1084, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434702, 'reachable_time': 35230, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213884, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:45.128 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[9422b138-6407-48c5-99bb-d15acf10f01a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434715, 'tstamp': 434715}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213885, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434718, 'tstamp': 434718}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213885, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:45.130 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74ffbf65-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:45 compute-0 nova_compute[189265]: 2025-09-30 07:17:45.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:45 compute-0 nova_compute[189265]: 2025-09-30 07:17:45.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:45.134 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74ffbf65-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:45.134 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:17:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:45.135 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74ffbf65-e0, col_values=(('external_ids', {'iface-id': '0c700e20-e593-4a77-93d7-fc919dc1f294'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:45.135 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:17:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:45.137 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[8d5087a1-177c-4d70-af8b-3db4c2411893]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-74ffbf65-ebbd-4587-bf5b-0b38421a4813\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 74ffbf65-ebbd-4587-bf5b-0b38421a4813\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:45 compute-0 nova_compute[189265]: 2025-09-30 07:17:45.148 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:45 compute-0 nova_compute[189265]: 2025-09-30 07:17:45.266 2 WARNING neutronclient.v2_0.client [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:46 compute-0 nova_compute[189265]: 2025-09-30 07:17:46.238 2 INFO nova.network.neutron [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Updating port d4e03551-f8cd-4604-990f-8c855bea77fa with attributes {'binding:host_id': 'compute-0.ctlplane.example.com', 'device_owner': 'compute:nova'}
Sep 30 07:17:46 compute-0 podman[213894]: 2025-09-30 07:17:46.476266707 +0000 UTC m=+0.058978359 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vendor=Red Hat, Inc., version=9.6, distribution-scope=public, release=1755695350, config_id=edpm, vcs-type=git, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container)
Sep 30 07:17:46 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:46.601 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:46 compute-0 nova_compute[189265]: 2025-09-30 07:17:46.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:47 compute-0 nova_compute[189265]: 2025-09-30 07:17:47.553 2 DEBUG oslo_concurrency.lockutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-3ad1a338-1146-48fa-a1fb-579e9b577b6c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:17:47 compute-0 nova_compute[189265]: 2025-09-30 07:17:47.554 2 DEBUG oslo_concurrency.lockutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-3ad1a338-1146-48fa-a1fb-579e9b577b6c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:17:47 compute-0 nova_compute[189265]: 2025-09-30 07:17:47.554 2 DEBUG nova.network.neutron [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:17:47 compute-0 nova_compute[189265]: 2025-09-30 07:17:47.742 2 DEBUG nova.compute.manager [req-31d0b6f3-ff0b-44d9-bf85-03b22079f203 req-38867398-a863-433b-8b7e-05aa74c38607 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Received event network-changed-d4e03551-f8cd-4604-990f-8c855bea77fa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:17:47 compute-0 nova_compute[189265]: 2025-09-30 07:17:47.743 2 DEBUG nova.compute.manager [req-31d0b6f3-ff0b-44d9-bf85-03b22079f203 req-38867398-a863-433b-8b7e-05aa74c38607 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Refreshing instance network info cache due to event network-changed-d4e03551-f8cd-4604-990f-8c855bea77fa. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 07:17:47 compute-0 nova_compute[189265]: 2025-09-30 07:17:47.743 2 DEBUG oslo_concurrency.lockutils [req-31d0b6f3-ff0b-44d9-bf85-03b22079f203 req-38867398-a863-433b-8b7e-05aa74c38607 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-3ad1a338-1146-48fa-a1fb-579e9b577b6c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:17:48 compute-0 nova_compute[189265]: 2025-09-30 07:17:48.180 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:17:48 compute-0 nova_compute[189265]: 2025-09-30 07:17:48.226 2 WARNING neutronclient.v2_0.client [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:48 compute-0 nova_compute[189265]: 2025-09-30 07:17:48.687 2 WARNING neutronclient.v2_0.client [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:48 compute-0 ovn_controller[91436]: 2025-09-30T07:17:48Z|00057|binding|INFO|Claiming lport 50e9f0fc-d5c3-4230-aea5-ef47736ac58f for this chassis.
Sep 30 07:17:48 compute-0 ovn_controller[91436]: 2025-09-30T07:17:48Z|00058|binding|INFO|50e9f0fc-d5c3-4230-aea5-ef47736ac58f: Claiming fa:16:3e:d0:55:ec 10.100.0.13
Sep 30 07:17:48 compute-0 ovn_controller[91436]: 2025-09-30T07:17:48Z|00059|binding|INFO|Setting lport 50e9f0fc-d5c3-4230-aea5-ef47736ac58f up in Southbound
Sep 30 07:17:49 compute-0 nova_compute[189265]: 2025-09-30 07:17:49.372 2 DEBUG nova.network.neutron [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Updating instance_info_cache with network_info: [{"id": "d4e03551-f8cd-4604-990f-8c855bea77fa", "address": "fa:16:3e:c9:35:b6", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4e03551-f8", "ovs_interfaceid": "d4e03551-f8cd-4604-990f-8c855bea77fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:17:49 compute-0 podman[213929]: 2025-09-30 07:17:49.482194551 +0000 UTC m=+0.065636904 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 07:17:49 compute-0 podman[213930]: 2025-09-30 07:17:49.549830781 +0000 UTC m=+0.128992339 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 07:17:49 compute-0 nova_compute[189265]: 2025-09-30 07:17:49.878 2 DEBUG oslo_concurrency.lockutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-3ad1a338-1146-48fa-a1fb-579e9b577b6c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:17:49 compute-0 nova_compute[189265]: 2025-09-30 07:17:49.885 2 DEBUG oslo_concurrency.lockutils [req-31d0b6f3-ff0b-44d9-bf85-03b22079f203 req-38867398-a863-433b-8b7e-05aa74c38607 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-3ad1a338-1146-48fa-a1fb-579e9b577b6c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:17:49 compute-0 nova_compute[189265]: 2025-09-30 07:17:49.885 2 DEBUG nova.network.neutron [req-31d0b6f3-ff0b-44d9-bf85-03b22079f203 req-38867398-a863-433b-8b7e-05aa74c38607 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Refreshing network info cache for port d4e03551-f8cd-4604-990f-8c855bea77fa _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 07:17:50 compute-0 nova_compute[189265]: 2025-09-30 07:17:50.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:50 compute-0 nova_compute[189265]: 2025-09-30 07:17:50.348 2 INFO nova.compute.manager [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Post operation of migration started
Sep 30 07:17:50 compute-0 nova_compute[189265]: 2025-09-30 07:17:50.349 2 WARNING neutronclient.v2_0.client [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:50 compute-0 nova_compute[189265]: 2025-09-30 07:17:50.396 2 WARNING neutronclient.v2_0.client [req-31d0b6f3-ff0b-44d9-bf85-03b22079f203 req-38867398-a863-433b-8b7e-05aa74c38607 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:50 compute-0 nova_compute[189265]: 2025-09-30 07:17:50.457 2 DEBUG nova.virt.libvirt.driver [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Starting finish_migration finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12604
Sep 30 07:17:50 compute-0 nova_compute[189265]: 2025-09-30 07:17:50.460 2 DEBUG nova.virt.libvirt.driver [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Instance directory exists: not creating _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5134
Sep 30 07:17:50 compute-0 nova_compute[189265]: 2025-09-30 07:17:50.460 2 INFO nova.virt.libvirt.driver [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Creating image(s)
Sep 30 07:17:50 compute-0 nova_compute[189265]: 2025-09-30 07:17:50.461 2 DEBUG nova.objects.instance [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3ad1a338-1146-48fa-a1fb-579e9b577b6c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:17:50 compute-0 nova_compute[189265]: 2025-09-30 07:17:50.969 2 DEBUG oslo_concurrency.processutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.055 2 DEBUG oslo_concurrency.processutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.057 2 DEBUG nova.virt.disk.api [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Checking if we can resize image /var/lib/nova/instances/3ad1a338-1146-48fa-a1fb-579e9b577b6c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.058 2 DEBUG oslo_concurrency.processutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ad1a338-1146-48fa-a1fb-579e9b577b6c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.123 2 DEBUG oslo_concurrency.processutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ad1a338-1146-48fa-a1fb-579e9b577b6c/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.124 2 DEBUG nova.virt.disk.api [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Cannot resize image /var/lib/nova/instances/3ad1a338-1146-48fa-a1fb-579e9b577b6c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.234 2 WARNING neutronclient.v2_0.client [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.235 2 WARNING neutronclient.v2_0.client [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.328 2 DEBUG oslo_concurrency.lockutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-a62dd947-c757-461c-9dd7-2ccd8c8daf8c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.329 2 DEBUG oslo_concurrency.lockutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-a62dd947-c757-461c-9dd7-2ccd8c8daf8c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.329 2 DEBUG nova.network.neutron [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:17:51 compute-0 podman[213982]: 2025-09-30 07:17:51.534697388 +0000 UTC m=+0.102047654 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.635 2 DEBUG nova.virt.libvirt.driver [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Did not create local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5272
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.635 2 DEBUG nova.virt.libvirt.driver [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Ensure instance console log exists: /var/lib/nova/instances/3ad1a338-1146-48fa-a1fb-579e9b577b6c/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.636 2 DEBUG oslo_concurrency.lockutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.637 2 DEBUG oslo_concurrency.lockutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.637 2 DEBUG oslo_concurrency.lockutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.641 2 DEBUG nova.virt.libvirt.driver [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Start _get_guest_xml network_info=[{"id": "d4e03551-f8cd-4604-990f-8c855bea77fa", "address": "fa:16:3e:c9:35:b6", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "vif_mac": "fa:16:3e:c9:35:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4e03551-f8", "ovs_interfaceid": "d4e03551-f8cd-4604-990f-8c855bea77fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '0c6b92f5-9861-49e4-862d-3ffd84520dfa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.648 2 WARNING nova.virt.libvirt.driver [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.650 2 DEBUG nova.virt.driver [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteActionsViaActuator-server-1188797546', uuid='3ad1a338-1146-48fa-a1fb-579e9b577b6c'), owner=OwnerMeta(userid='d6cb6be5d6fc407eb3abc1c7c70f5d77', username='tempest-TestExecuteActionsViaActuator-2061885601-project-admin', projectid='1413b21c2db845e58d8a81f524a55f3a', projectname='tempest-TestExecuteActionsViaActuator-2061885601'), image=ImageMeta(id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_cdrom_bus': 'sata', 'hw_disk_bus': 'virtio', 'hw_input_bus': 'usb', 'hw_machine_type': 'q35', 'hw_pointer_model': 'usbtablet', 'hw_rng_model': 'virtio', 'hw_video_model': 'virtio', 'hw_vif_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "d4e03551-f8cd-4604-990f-8c855bea77fa", "address": "fa:16:3e:c9:35:b6", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "vif_mac": "fa:16:3e:c9:35:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4e03551-f8", "ovs_interfaceid": "d4e03551-f8cd-4604-990f-8c855bea77fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759216671.650538) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.656 2 DEBUG nova.virt.libvirt.host [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.657 2 DEBUG nova.virt.libvirt.host [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.661 2 DEBUG nova.virt.libvirt.host [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.662 2 DEBUG nova.virt.libvirt.host [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.663 2 DEBUG nova.virt.libvirt.driver [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.664 2 DEBUG nova.virt.hardware [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T07:07:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.665 2 DEBUG nova.virt.hardware [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.665 2 DEBUG nova.virt.hardware [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.666 2 DEBUG nova.virt.hardware [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.666 2 DEBUG nova.virt.hardware [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.667 2 DEBUG nova.virt.hardware [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.667 2 DEBUG nova.virt.hardware [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.667 2 DEBUG nova.virt.hardware [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.668 2 DEBUG nova.virt.hardware [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.668 2 DEBUG nova.virt.hardware [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.669 2 DEBUG nova.virt.hardware [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.669 2 DEBUG nova.objects.instance [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'vcpu_model' on Instance uuid 3ad1a338-1146-48fa-a1fb-579e9b577b6c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.836 2 WARNING neutronclient.v2_0.client [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:51 compute-0 nova_compute[189265]: 2025-09-30 07:17:51.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.177 2 DEBUG nova.objects.base [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<3ad1a338-1146-48fa-a1fb-579e9b577b6c> lazy-loaded attributes: trusted_certs,vcpu_model wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.182 2 DEBUG oslo_concurrency.processutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ad1a338-1146-48fa-a1fb-579e9b577b6c/disk.config --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.254 2 DEBUG oslo_concurrency.processutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3ad1a338-1146-48fa-a1fb-579e9b577b6c/disk.config --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.255 2 DEBUG oslo_concurrency.lockutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "/var/lib/nova/instances/3ad1a338-1146-48fa-a1fb-579e9b577b6c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.256 2 DEBUG oslo_concurrency.lockutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "/var/lib/nova/instances/3ad1a338-1146-48fa-a1fb-579e9b577b6c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.257 2 DEBUG oslo_concurrency.lockutils [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "/var/lib/nova/instances/3ad1a338-1146-48fa-a1fb-579e9b577b6c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.258 2 DEBUG nova.virt.libvirt.vif [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:16:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1188797546',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1188797546',id=6,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:16:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1413b21c2db845e58d8a81f524a55f3a',ramdisk_id='',reservation_id='r-yngwu08y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-2061885601',owner_user_name='tempest-TestExecuteActionsViaActuator-2061885601-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:17:43Z,user_data=None,user_id='d6cb6be5d6fc407eb3abc1c7c70f5d77',uuid=3ad1a338-1146-48fa-a1fb-579e9b577b6c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4e03551-f8cd-4604-990f-8c855bea77fa", "address": "fa:16:3e:c9:35:b6", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "vif_mac": "fa:16:3e:c9:35:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4e03551-f8", "ovs_interfaceid": "d4e03551-f8cd-4604-990f-8c855bea77fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.258 2 DEBUG nova.network.os_vif_util [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "d4e03551-f8cd-4604-990f-8c855bea77fa", "address": "fa:16:3e:c9:35:b6", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "vif_mac": "fa:16:3e:c9:35:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4e03551-f8", "ovs_interfaceid": "d4e03551-f8cd-4604-990f-8c855bea77fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.259 2 DEBUG nova.network.os_vif_util [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:35:b6,bridge_name='br-int',has_traffic_filtering=True,id=d4e03551-f8cd-4604-990f-8c855bea77fa,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4e03551-f8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.262 2 DEBUG nova.virt.libvirt.driver [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] End _get_guest_xml xml=<domain type="kvm">
Sep 30 07:17:52 compute-0 nova_compute[189265]:   <uuid>3ad1a338-1146-48fa-a1fb-579e9b577b6c</uuid>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   <name>instance-00000006</name>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   <memory>131072</memory>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   <vcpu>1</vcpu>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteActionsViaActuator-server-1188797546</nova:name>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:17:51</nova:creationTime>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:17:52 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:17:52 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:17:52 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:17:52 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:17:52 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:17:52 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:17:52 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:17:52 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:17:52 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:17:52 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:17:52 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:17:52 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:17:52 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:17:52 compute-0 nova_compute[189265]:           <nova:property name="hw_cdrom_bus">sata</nova:property>
Sep 30 07:17:52 compute-0 nova_compute[189265]:           <nova:property name="hw_disk_bus">virtio</nova:property>
Sep 30 07:17:52 compute-0 nova_compute[189265]:           <nova:property name="hw_input_bus">usb</nova:property>
Sep 30 07:17:52 compute-0 nova_compute[189265]:           <nova:property name="hw_machine_type">q35</nova:property>
Sep 30 07:17:52 compute-0 nova_compute[189265]:           <nova:property name="hw_pointer_model">usbtablet</nova:property>
Sep 30 07:17:52 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:17:52 compute-0 nova_compute[189265]:           <nova:property name="hw_video_model">virtio</nova:property>
Sep 30 07:17:52 compute-0 nova_compute[189265]:           <nova:property name="hw_vif_model">virtio</nova:property>
Sep 30 07:17:52 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:17:52 compute-0 nova_compute[189265]:         <nova:user uuid="d6cb6be5d6fc407eb3abc1c7c70f5d77">tempest-TestExecuteActionsViaActuator-2061885601-project-admin</nova:user>
Sep 30 07:17:52 compute-0 nova_compute[189265]:         <nova:project uuid="1413b21c2db845e58d8a81f524a55f3a">tempest-TestExecuteActionsViaActuator-2061885601</nova:project>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:17:52 compute-0 nova_compute[189265]:         <nova:port uuid="d4e03551-f8cd-4604-990f-8c855bea77fa">
Sep 30 07:17:52 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <system>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <entry name="serial">3ad1a338-1146-48fa-a1fb-579e9b577b6c</entry>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <entry name="uuid">3ad1a338-1146-48fa-a1fb-579e9b577b6c</entry>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     </system>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   <os>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   </os>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   <features>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <vmcoreinfo/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   </features>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   <cpu mode="host-model" match="exact">
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/3ad1a338-1146-48fa-a1fb-579e9b577b6c/disk"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/3ad1a338-1146-48fa-a1fb-579e9b577b6c/disk.config"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:c9:35:b6"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <target dev="tapd4e03551-f8"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/3ad1a338-1146-48fa-a1fb-579e9b577b6c/console.log" append="off"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <video>
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     </video>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <controller type="usb" index="0"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:17:52 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:17:52 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:17:52 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:17:52 compute-0 nova_compute[189265]: </domain>
Sep 30 07:17:52 compute-0 nova_compute[189265]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.265 2 DEBUG nova.virt.libvirt.vif [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:16:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1188797546',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1188797546',id=6,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:16:51Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='1413b21c2db845e58d8a81f524a55f3a',ramdisk_id='',reservation_id='r-yngwu08y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_mod
el='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestExecuteActionsViaActuator-2061885601',owner_user_name='tempest-TestExecuteActionsViaActuator-2061885601-project-admin'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:17:43Z,user_data=None,user_id='d6cb6be5d6fc407eb3abc1c7c70f5d77',uuid=3ad1a338-1146-48fa-a1fb-579e9b577b6c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4e03551-f8cd-4604-990f-8c855bea77fa", "address": "fa:16:3e:c9:35:b6", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "vif_mac": "fa:16:3e:c9:35:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4e03551-f8", "ovs_interfaceid": "d4e03551-f8cd-4604-990f-8c855bea77fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.265 2 DEBUG nova.network.os_vif_util [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "d4e03551-f8cd-4604-990f-8c855bea77fa", "address": "fa:16:3e:c9:35:b6", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "vif_mac": "fa:16:3e:c9:35:b6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4e03551-f8", "ovs_interfaceid": "d4e03551-f8cd-4604-990f-8c855bea77fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.266 2 DEBUG nova.network.os_vif_util [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c9:35:b6,bridge_name='br-int',has_traffic_filtering=True,id=d4e03551-f8cd-4604-990f-8c855bea77fa,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4e03551-f8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.267 2 DEBUG os_vif [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:35:b6,bridge_name='br-int',has_traffic_filtering=True,id=d4e03551-f8cd-4604-990f-8c855bea77fa,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4e03551-f8') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.269 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.270 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.271 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1d244813-0abe-5005-ba4a-234156ab0802', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.278 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4e03551-f8, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.279 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapd4e03551-f8, col_values=(('qos', UUID('a509e0ae-a3ab-4024-8f26-5034ec138084')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.279 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapd4e03551-f8, col_values=(('external_ids', {'iface-id': 'd4e03551-f8cd-4604-990f-8c855bea77fa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c9:35:b6', 'vm-uuid': '3ad1a338-1146-48fa-a1fb-579e9b577b6c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:52 compute-0 NetworkManager[51813]: <info>  [1759216672.2818] manager: (tapd4e03551-f8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.288 2 INFO os_vif [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c9:35:b6,bridge_name='br-int',has_traffic_filtering=True,id=d4e03551-f8cd-4604-990f-8c855bea77fa,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4e03551-f8')
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.297 2 WARNING neutronclient.v2_0.client [req-31d0b6f3-ff0b-44d9-bf85-03b22079f203 req-38867398-a863-433b-8b7e-05aa74c38607 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.477 2 DEBUG nova.network.neutron [req-31d0b6f3-ff0b-44d9-bf85-03b22079f203 req-38867398-a863-433b-8b7e-05aa74c38607 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Updated VIF entry in instance network info cache for port d4e03551-f8cd-4604-990f-8c855bea77fa. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 07:17:52 compute-0 nova_compute[189265]: 2025-09-30 07:17:52.478 2 DEBUG nova.network.neutron [req-31d0b6f3-ff0b-44d9-bf85-03b22079f203 req-38867398-a863-433b-8b7e-05aa74c38607 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Updating instance_info_cache with network_info: [{"id": "d4e03551-f8cd-4604-990f-8c855bea77fa", "address": "fa:16:3e:c9:35:b6", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4e03551-f8", "ovs_interfaceid": "d4e03551-f8cd-4604-990f-8c855bea77fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:17:52 compute-0 systemd[1]: Stopping User Manager for UID 42436...
Sep 30 07:17:52 compute-0 systemd[213757]: Activating special unit Exit the Session...
Sep 30 07:17:52 compute-0 systemd[213757]: Stopped target Main User Target.
Sep 30 07:17:52 compute-0 systemd[213757]: Stopped target Basic System.
Sep 30 07:17:52 compute-0 systemd[213757]: Stopped target Paths.
Sep 30 07:17:52 compute-0 systemd[213757]: Stopped target Sockets.
Sep 30 07:17:52 compute-0 systemd[213757]: Stopped target Timers.
Sep 30 07:17:52 compute-0 systemd[213757]: Stopped Mark boot as successful after the user session has run 2 minutes.
Sep 30 07:17:52 compute-0 systemd[213757]: Stopped Daily Cleanup of User's Temporary Directories.
Sep 30 07:17:52 compute-0 systemd[213757]: Closed D-Bus User Message Bus Socket.
Sep 30 07:17:52 compute-0 systemd[213757]: Stopped Create User's Volatile Files and Directories.
Sep 30 07:17:52 compute-0 systemd[213757]: Removed slice User Application Slice.
Sep 30 07:17:52 compute-0 systemd[213757]: Reached target Shutdown.
Sep 30 07:17:52 compute-0 systemd[213757]: Finished Exit the Session.
Sep 30 07:17:52 compute-0 systemd[213757]: Reached target Exit the Session.
Sep 30 07:17:52 compute-0 systemd[1]: user@42436.service: Deactivated successfully.
Sep 30 07:17:52 compute-0 systemd[1]: Stopped User Manager for UID 42436.
Sep 30 07:17:52 compute-0 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Sep 30 07:17:52 compute-0 systemd[1]: run-user-42436.mount: Deactivated successfully.
Sep 30 07:17:52 compute-0 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Sep 30 07:17:52 compute-0 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Sep 30 07:17:52 compute-0 systemd[1]: Removed slice User Slice of UID 42436.
Sep 30 07:17:53 compute-0 nova_compute[189265]: 2025-09-30 07:17:53.013 2 DEBUG oslo_concurrency.lockutils [req-31d0b6f3-ff0b-44d9-bf85-03b22079f203 req-38867398-a863-433b-8b7e-05aa74c38607 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-3ad1a338-1146-48fa-a1fb-579e9b577b6c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:17:53 compute-0 nova_compute[189265]: 2025-09-30 07:17:53.291 2 WARNING neutronclient.v2_0.client [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:53 compute-0 nova_compute[189265]: 2025-09-30 07:17:53.508 2 DEBUG nova.network.neutron [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Updating instance_info_cache with network_info: [{"id": "50e9f0fc-d5c3-4230-aea5-ef47736ac58f", "address": "fa:16:3e:d0:55:ec", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50e9f0fc-d5", "ovs_interfaceid": "50e9f0fc-d5c3-4230-aea5-ef47736ac58f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.067 2 DEBUG nova.virt.libvirt.driver [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.068 2 DEBUG nova.virt.libvirt.driver [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.068 2 DEBUG nova.virt.libvirt.driver [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] No VIF found with MAC fa:16:3e:c9:35:b6, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.070 2 INFO nova.virt.libvirt.driver [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Using config drive
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.100 2 DEBUG oslo_concurrency.lockutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-a62dd947-c757-461c-9dd7-2ccd8c8daf8c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:17:54 compute-0 kernel: tapd4e03551-f8: entered promiscuous mode
Sep 30 07:17:54 compute-0 ovn_controller[91436]: 2025-09-30T07:17:54Z|00060|binding|INFO|Claiming lport d4e03551-f8cd-4604-990f-8c855bea77fa for this chassis.
Sep 30 07:17:54 compute-0 NetworkManager[51813]: <info>  [1759216674.1626] manager: (tapd4e03551-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/33)
Sep 30 07:17:54 compute-0 ovn_controller[91436]: 2025-09-30T07:17:54Z|00061|binding|INFO|d4e03551-f8cd-4604-990f-8c855bea77fa: Claiming fa:16:3e:c9:35:b6 10.100.0.12
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:54 compute-0 ovn_controller[91436]: 2025-09-30T07:17:54Z|00062|binding|INFO|Setting lport d4e03551-f8cd-4604-990f-8c855bea77fa ovn-installed in OVS
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.182 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:54 compute-0 systemd-udevd[214024]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:17:54 compute-0 systemd-machined[149233]: New machine qemu-5-instance-00000006.
Sep 30 07:17:54 compute-0 ovn_controller[91436]: 2025-09-30T07:17:54Z|00063|binding|INFO|Setting lport d4e03551-f8cd-4604-990f-8c855bea77fa up in Southbound
Sep 30 07:17:54 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:54.219 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:35:b6 10.100.0.12'], port_security=['fa:16:3e:c9:35:b6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3ad1a338-1146-48fa-a1fb-579e9b577b6c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1413b21c2db845e58d8a81f524a55f3a', 'neutron:revision_number': '9', 'neutron:security_group_ids': '8ad3c6f6-3842-4d69-92ac-cef07b75c3bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b541691-433c-426c-b8b7-10d79319603a, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=d4e03551-f8cd-4604-990f-8c855bea77fa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:17:54 compute-0 NetworkManager[51813]: <info>  [1759216674.2230] device (tapd4e03551-f8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:17:54 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:54.222 100322 INFO neutron.agent.ovn.metadata.agent [-] Port d4e03551-f8cd-4604-990f-8c855bea77fa in datapath 74ffbf65-ebbd-4587-bf5b-0b38421a4813 bound to our chassis
Sep 30 07:17:54 compute-0 systemd[1]: Started Virtual Machine qemu-5-instance-00000006.
Sep 30 07:17:54 compute-0 NetworkManager[51813]: <info>  [1759216674.2253] device (tapd4e03551-f8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:17:54 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:54.226 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74ffbf65-ebbd-4587-bf5b-0b38421a4813
Sep 30 07:17:54 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:54.249 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b6c7fa67-d632-4ccf-af78-c57954568fb5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:54 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:54.292 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[0ebbe64b-dd59-40d7-a83b-f3a72caf3ce5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:54 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:54.294 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[d967f6a1-028d-451b-ae62-ad678bae6b6e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:54 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:54.325 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[59ed1cbb-b64f-4eea-b13b-b01f832d4750]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:54 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:54.351 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[2105c601-6ef3-45c9-b9fc-68fc77db7c67]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74ffbf65-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:ef:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 19, 'tx_packets': 11, 'rx_bytes': 1294, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 19, 'tx_packets': 11, 'rx_bytes': 1294, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434702, 'reachable_time': 35230, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214038, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:54 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:54.375 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ef7d5b-7e39-4542-b2b4-66887867911a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434715, 'tstamp': 434715}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214039, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434718, 'tstamp': 434718}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214039, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:54 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:54.377 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74ffbf65-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:54 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:54.382 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74ffbf65-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:54 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:54.383 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:17:54 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:54.383 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74ffbf65-e0, col_values=(('external_ids', {'iface-id': '0c700e20-e593-4a77-93d7-fc919dc1f294'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:17:54 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:54.384 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:17:54 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:17:54.386 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[2432b404-f9f6-4929-ba86-624caa44db5c]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-74ffbf65-ebbd-4587-bf5b-0b38421a4813\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 74ffbf65-ebbd-4587-bf5b-0b38421a4813\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.791 2 DEBUG nova.compute.manager [req-2f2833e3-1311-470f-84ac-f8634dd4af3e req-feec0cc2-8f63-4d54-b93e-61c8db47261f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Received event network-vif-plugged-d4e03551-f8cd-4604-990f-8c855bea77fa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.791 2 DEBUG oslo_concurrency.lockutils [req-2f2833e3-1311-470f-84ac-f8634dd4af3e req-feec0cc2-8f63-4d54-b93e-61c8db47261f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.792 2 DEBUG oslo_concurrency.lockutils [req-2f2833e3-1311-470f-84ac-f8634dd4af3e req-feec0cc2-8f63-4d54-b93e-61c8db47261f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.792 2 DEBUG oslo_concurrency.lockutils [req-2f2833e3-1311-470f-84ac-f8634dd4af3e req-feec0cc2-8f63-4d54-b93e-61c8db47261f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.792 2 DEBUG nova.compute.manager [req-2f2833e3-1311-470f-84ac-f8634dd4af3e req-feec0cc2-8f63-4d54-b93e-61c8db47261f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] No waiting events found dispatching network-vif-plugged-d4e03551-f8cd-4604-990f-8c855bea77fa pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.793 2 WARNING nova.compute.manager [req-2f2833e3-1311-470f-84ac-f8634dd4af3e req-feec0cc2-8f63-4d54-b93e-61c8db47261f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Received unexpected event network-vif-plugged-d4e03551-f8cd-4604-990f-8c855bea77fa for instance with vm_state active and task_state resize_finish.
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.902 2 DEBUG oslo_concurrency.lockutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.903 2 DEBUG oslo_concurrency.lockutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.903 2 DEBUG oslo_concurrency.lockutils [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:54 compute-0 nova_compute[189265]: 2025-09-30 07:17:54.909 2 INFO nova.virt.libvirt.driver [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 07:17:54 compute-0 virtqemud[189090]: Domain id=4 name='instance-00000004' uuid=a62dd947-c757-461c-9dd7-2ccd8c8daf8c is tainted: custom-monitor
Sep 30 07:17:55 compute-0 nova_compute[189265]: 2025-09-30 07:17:55.826 2 DEBUG nova.compute.manager [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 07:17:55 compute-0 nova_compute[189265]: 2025-09-30 07:17:55.829 2 INFO nova.virt.libvirt.driver [-] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Instance running successfully.
Sep 30 07:17:55 compute-0 virtqemud[189090]: argument unsupported: QEMU guest agent is not configured
Sep 30 07:17:55 compute-0 nova_compute[189265]: 2025-09-30 07:17:55.831 2 DEBUG nova.virt.libvirt.guest [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:200
Sep 30 07:17:55 compute-0 nova_compute[189265]: 2025-09-30 07:17:55.831 2 DEBUG nova.virt.libvirt.driver [None req-b9b37d3d-20c8-42d2-ba2d-92760d14ea6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] finish_migration finished successfully. finish_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12699
Sep 30 07:17:55 compute-0 nova_compute[189265]: 2025-09-30 07:17:55.918 2 INFO nova.virt.libvirt.driver [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 07:17:56 compute-0 nova_compute[189265]: 2025-09-30 07:17:56.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:56 compute-0 nova_compute[189265]: 2025-09-30 07:17:56.922 2 INFO nova.virt.libvirt.driver [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 07:17:56 compute-0 nova_compute[189265]: 2025-09-30 07:17:56.927 2 DEBUG nova.compute.manager [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:17:57 compute-0 nova_compute[189265]: 2025-09-30 07:17:57.159 2 DEBUG nova.compute.manager [req-449b222a-a93c-4924-aa5c-a26b4f8ffd3b req-5274bd13-7341-47ac-a938-c41ae5fa82a3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Received event network-vif-plugged-d4e03551-f8cd-4604-990f-8c855bea77fa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:17:57 compute-0 nova_compute[189265]: 2025-09-30 07:17:57.159 2 DEBUG oslo_concurrency.lockutils [req-449b222a-a93c-4924-aa5c-a26b4f8ffd3b req-5274bd13-7341-47ac-a938-c41ae5fa82a3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:17:57 compute-0 nova_compute[189265]: 2025-09-30 07:17:57.160 2 DEBUG oslo_concurrency.lockutils [req-449b222a-a93c-4924-aa5c-a26b4f8ffd3b req-5274bd13-7341-47ac-a938-c41ae5fa82a3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:17:57 compute-0 nova_compute[189265]: 2025-09-30 07:17:57.160 2 DEBUG oslo_concurrency.lockutils [req-449b222a-a93c-4924-aa5c-a26b4f8ffd3b req-5274bd13-7341-47ac-a938-c41ae5fa82a3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:17:57 compute-0 nova_compute[189265]: 2025-09-30 07:17:57.160 2 DEBUG nova.compute.manager [req-449b222a-a93c-4924-aa5c-a26b4f8ffd3b req-5274bd13-7341-47ac-a938-c41ae5fa82a3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] No waiting events found dispatching network-vif-plugged-d4e03551-f8cd-4604-990f-8c855bea77fa pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:17:57 compute-0 nova_compute[189265]: 2025-09-30 07:17:57.160 2 WARNING nova.compute.manager [req-449b222a-a93c-4924-aa5c-a26b4f8ffd3b req-5274bd13-7341-47ac-a938-c41ae5fa82a3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Received unexpected event network-vif-plugged-d4e03551-f8cd-4604-990f-8c855bea77fa for instance with vm_state resized and task_state None.
Sep 30 07:17:57 compute-0 nova_compute[189265]: 2025-09-30 07:17:57.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:17:57 compute-0 nova_compute[189265]: 2025-09-30 07:17:57.458 2 DEBUG nova.objects.instance [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 07:17:58 compute-0 nova_compute[189265]: 2025-09-30 07:17:58.535 2 WARNING neutronclient.v2_0.client [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:59 compute-0 nova_compute[189265]: 2025-09-30 07:17:59.208 2 WARNING neutronclient.v2_0.client [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:59 compute-0 nova_compute[189265]: 2025-09-30 07:17:59.209 2 WARNING neutronclient.v2_0.client [None req-42cd2ce7-990c-4cce-9ab5-295920036048 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:17:59 compute-0 podman[199733]: time="2025-09-30T07:17:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:17:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:17:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:17:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:17:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3464 "" "Go-http-client/1.1"
Sep 30 07:18:01 compute-0 openstack_network_exporter[201859]: ERROR   07:18:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:18:01 compute-0 openstack_network_exporter[201859]: ERROR   07:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:18:01 compute-0 openstack_network_exporter[201859]: ERROR   07:18:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:18:01 compute-0 openstack_network_exporter[201859]: ERROR   07:18:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:18:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:18:01 compute-0 openstack_network_exporter[201859]: ERROR   07:18:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:18:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:18:01 compute-0 nova_compute[189265]: 2025-09-30 07:18:01.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:02 compute-0 nova_compute[189265]: 2025-09-30 07:18:02.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:04 compute-0 podman[214049]: 2025-09-30 07:18:04.483518886 +0000 UTC m=+0.066408176 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:18:06 compute-0 nova_compute[189265]: 2025-09-30 07:18:06.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:07 compute-0 nova_compute[189265]: 2025-09-30 07:18:07.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:07 compute-0 ovn_controller[91436]: 2025-09-30T07:18:07Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c9:35:b6 10.100.0.12
Sep 30 07:18:11 compute-0 nova_compute[189265]: 2025-09-30 07:18:11.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:12 compute-0 nova_compute[189265]: 2025-09-30 07:18:12.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:12 compute-0 nova_compute[189265]: 2025-09-30 07:18:12.879 2 DEBUG oslo_concurrency.lockutils [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "992a0681-bc5e-40b3-adf3-305eee0718fd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:12 compute-0 nova_compute[189265]: 2025-09-30 07:18:12.880 2 DEBUG oslo_concurrency.lockutils [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:12 compute-0 nova_compute[189265]: 2025-09-30 07:18:12.880 2 DEBUG oslo_concurrency.lockutils [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:12 compute-0 nova_compute[189265]: 2025-09-30 07:18:12.881 2 DEBUG oslo_concurrency.lockutils [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:12 compute-0 nova_compute[189265]: 2025-09-30 07:18:12.881 2 DEBUG oslo_concurrency.lockutils [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:12 compute-0 nova_compute[189265]: 2025-09-30 07:18:12.900 2 INFO nova.compute.manager [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Terminating instance
Sep 30 07:18:13 compute-0 nova_compute[189265]: 2025-09-30 07:18:13.421 2 DEBUG nova.compute.manager [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:18:13 compute-0 kernel: tapa38c613b-6d (unregistering): left promiscuous mode
Sep 30 07:18:13 compute-0 NetworkManager[51813]: <info>  [1759216693.4516] device (tapa38c613b-6d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:18:13 compute-0 nova_compute[189265]: 2025-09-30 07:18:13.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:13 compute-0 ovn_controller[91436]: 2025-09-30T07:18:13Z|00064|binding|INFO|Releasing lport a38c613b-6d8b-4dc3-96e4-4a4103b20d91 from this chassis (sb_readonly=0)
Sep 30 07:18:13 compute-0 ovn_controller[91436]: 2025-09-30T07:18:13Z|00065|binding|INFO|Setting lport a38c613b-6d8b-4dc3-96e4-4a4103b20d91 down in Southbound
Sep 30 07:18:13 compute-0 ovn_controller[91436]: 2025-09-30T07:18:13Z|00066|binding|INFO|Removing iface tapa38c613b-6d ovn-installed in OVS
Sep 30 07:18:13 compute-0 nova_compute[189265]: 2025-09-30 07:18:13.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.497 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:ab:8a 10.100.0.7'], port_security=['fa:16:3e:ec:ab:8a 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '992a0681-bc5e-40b3-adf3-305eee0718fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1413b21c2db845e58d8a81f524a55f3a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8ad3c6f6-3842-4d69-92ac-cef07b75c3bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b541691-433c-426c-b8b7-10d79319603a, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=a38c613b-6d8b-4dc3-96e4-4a4103b20d91) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:18:13 compute-0 nova_compute[189265]: 2025-09-30 07:18:13.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.499 100322 INFO neutron.agent.ovn.metadata.agent [-] Port a38c613b-6d8b-4dc3-96e4-4a4103b20d91 in datapath 74ffbf65-ebbd-4587-bf5b-0b38421a4813 unbound from our chassis
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.502 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74ffbf65-ebbd-4587-bf5b-0b38421a4813
Sep 30 07:18:13 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Sep 30 07:18:13 compute-0 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 14.755s CPU time.
Sep 30 07:18:13 compute-0 systemd-machined[149233]: Machine qemu-3-instance-00000007 terminated.
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.524 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[4d5b4da2-3940-4e9e-94fd-ba267804bd1a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 podman[214083]: 2025-09-30 07:18:13.532765856 +0000 UTC m=+0.112911351 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.567 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[4dc28c95-ab7d-4789-8488-88d9a6d7ce6a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.570 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[2c72d3a5-702a-417b-b1c4-908c329be79c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.615 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[ba9dcb08-7380-4b0d-b39e-d4c7027d8302]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.629 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c2a5a0ee-51b4-41f2-b743-100199f3b213]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74ffbf65-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:ef:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 35, 'tx_packets': 13, 'rx_bytes': 1966, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 35, 'tx_packets': 13, 'rx_bytes': 1966, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434702, 'reachable_time': 35230, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214114, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 kernel: tapa38c613b-6d: entered promiscuous mode
Sep 30 07:18:13 compute-0 NetworkManager[51813]: <info>  [1759216693.6441] manager: (tapa38c613b-6d): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Sep 30 07:18:13 compute-0 kernel: tapa38c613b-6d (unregistering): left promiscuous mode
Sep 30 07:18:13 compute-0 nova_compute[189265]: 2025-09-30 07:18:13.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:13 compute-0 ovn_controller[91436]: 2025-09-30T07:18:13Z|00067|binding|INFO|Claiming lport a38c613b-6d8b-4dc3-96e4-4a4103b20d91 for this chassis.
Sep 30 07:18:13 compute-0 ovn_controller[91436]: 2025-09-30T07:18:13Z|00068|binding|INFO|a38c613b-6d8b-4dc3-96e4-4a4103b20d91: Claiming fa:16:3e:ec:ab:8a 10.100.0.7
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.652 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[2434c10b-3ead-4506-a352-a286f783bfe3]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434715, 'tstamp': 434715}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214116, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434718, 'tstamp': 434718}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214116, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.653 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74ffbf65-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:13 compute-0 nova_compute[189265]: 2025-09-30 07:18:13.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.676 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:ab:8a 10.100.0.7'], port_security=['fa:16:3e:ec:ab:8a 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '992a0681-bc5e-40b3-adf3-305eee0718fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1413b21c2db845e58d8a81f524a55f3a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8ad3c6f6-3842-4d69-92ac-cef07b75c3bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b541691-433c-426c-b8b7-10d79319603a, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=a38c613b-6d8b-4dc3-96e4-4a4103b20d91) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:18:13 compute-0 nova_compute[189265]: 2025-09-30 07:18:13.683 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:13 compute-0 ovn_controller[91436]: 2025-09-30T07:18:13Z|00069|binding|INFO|Setting lport a38c613b-6d8b-4dc3-96e4-4a4103b20d91 ovn-installed in OVS
Sep 30 07:18:13 compute-0 ovn_controller[91436]: 2025-09-30T07:18:13Z|00070|binding|INFO|Setting lport a38c613b-6d8b-4dc3-96e4-4a4103b20d91 up in Southbound
Sep 30 07:18:13 compute-0 ovn_controller[91436]: 2025-09-30T07:18:13Z|00071|binding|INFO|Releasing lport a38c613b-6d8b-4dc3-96e4-4a4103b20d91 from this chassis (sb_readonly=1)
Sep 30 07:18:13 compute-0 ovn_controller[91436]: 2025-09-30T07:18:13Z|00072|if_status|INFO|Not setting lport a38c613b-6d8b-4dc3-96e4-4a4103b20d91 down as sb is readonly
Sep 30 07:18:13 compute-0 ovn_controller[91436]: 2025-09-30T07:18:13Z|00073|binding|INFO|Removing iface tapa38c613b-6d ovn-installed in OVS
Sep 30 07:18:13 compute-0 nova_compute[189265]: 2025-09-30 07:18:13.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:13 compute-0 ovn_controller[91436]: 2025-09-30T07:18:13Z|00074|binding|INFO|Releasing lport a38c613b-6d8b-4dc3-96e4-4a4103b20d91 from this chassis (sb_readonly=0)
Sep 30 07:18:13 compute-0 ovn_controller[91436]: 2025-09-30T07:18:13Z|00075|binding|INFO|Setting lport a38c613b-6d8b-4dc3-96e4-4a4103b20d91 down in Southbound
Sep 30 07:18:13 compute-0 nova_compute[189265]: 2025-09-30 07:18:13.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:13 compute-0 nova_compute[189265]: 2025-09-30 07:18:13.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.707 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74ffbf65-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.707 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.708 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74ffbf65-e0, col_values=(('external_ids', {'iface-id': '0c700e20-e593-4a77-93d7-fc919dc1f294'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.708 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.709 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[477109ce-76b5-4f79-8418-c01d1e18cb08]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-74ffbf65-ebbd-4587-bf5b-0b38421a4813\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 74ffbf65-ebbd-4587-bf5b-0b38421a4813\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.711 100322 INFO neutron.agent.ovn.metadata.agent [-] Port a38c613b-6d8b-4dc3-96e4-4a4103b20d91 in datapath 74ffbf65-ebbd-4587-bf5b-0b38421a4813 unbound from our chassis
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.715 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74ffbf65-ebbd-4587-bf5b-0b38421a4813
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.717 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ec:ab:8a 10.100.0.7'], port_security=['fa:16:3e:ec:ab:8a 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '992a0681-bc5e-40b3-adf3-305eee0718fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1413b21c2db845e58d8a81f524a55f3a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8ad3c6f6-3842-4d69-92ac-cef07b75c3bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b541691-433c-426c-b8b7-10d79319603a, chassis=[], tunnel_key=8, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=a38c613b-6d8b-4dc3-96e4-4a4103b20d91) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:18:13 compute-0 nova_compute[189265]: 2025-09-30 07:18:13.728 2 INFO nova.virt.libvirt.driver [-] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Instance destroyed successfully.
Sep 30 07:18:13 compute-0 nova_compute[189265]: 2025-09-30 07:18:13.728 2 DEBUG nova.objects.instance [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lazy-loading 'resources' on Instance uuid 992a0681-bc5e-40b3-adf3-305eee0718fd obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.740 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f8a1e319-fb96-4ba3-b76d-78c423cc6c7b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.771 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[ff471c4b-4a05-41e3-9ee7-11a1283a92c1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.773 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[02a6c340-3caf-431c-b431-8eba3217b315]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.807 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[882d6db8-5aa5-45f4-b8ee-ea69c1e7311d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.830 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[2d9b3580-ea66-4984-82bc-0beffa5090b1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74ffbf65-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:ef:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 35, 'tx_packets': 15, 'rx_bytes': 1966, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 35, 'tx_packets': 15, 'rx_bytes': 1966, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434702, 'reachable_time': 35230, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214132, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.851 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[af6c2e32-9aab-4bb6-9dff-3eb91d76d626]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434715, 'tstamp': 434715}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214133, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434718, 'tstamp': 434718}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214133, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.852 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74ffbf65-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:13 compute-0 nova_compute[189265]: 2025-09-30 07:18:13.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:13 compute-0 nova_compute[189265]: 2025-09-30 07:18:13.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.859 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74ffbf65-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.859 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.860 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74ffbf65-e0, col_values=(('external_ids', {'iface-id': '0c700e20-e593-4a77-93d7-fc919dc1f294'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.861 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.862 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[688b9978-4eb6-4c21-9adf-91bd29f6ef27]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-74ffbf65-ebbd-4587-bf5b-0b38421a4813\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 74ffbf65-ebbd-4587-bf5b-0b38421a4813\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.863 100322 INFO neutron.agent.ovn.metadata.agent [-] Port a38c613b-6d8b-4dc3-96e4-4a4103b20d91 in datapath 74ffbf65-ebbd-4587-bf5b-0b38421a4813 unbound from our chassis
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.865 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74ffbf65-ebbd-4587-bf5b-0b38421a4813
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.881 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f87504a6-fe4a-4e93-9a03-d42ae55c8702]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.910 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[0251f8fb-4807-4329-9f50-f56447d0d5ed]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.914 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[d4efd033-724a-4d68-83c8-9081d52d6381]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.947 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[065cc076-492a-4b1d-bb82-67cbec701295]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.966 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[9a24fca7-6676-453a-8144-0274b7ff2940]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74ffbf65-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:ef:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 35, 'tx_packets': 17, 'rx_bytes': 1966, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 35, 'tx_packets': 17, 'rx_bytes': 1966, 'tx_bytes': 858, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434702, 'reachable_time': 35230, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214140, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.984 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[2cc5e5cf-ecb4-4d11-9ff7-6655c370f99c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434715, 'tstamp': 434715}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214141, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434718, 'tstamp': 434718}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214141, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.985 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74ffbf65-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:13 compute-0 nova_compute[189265]: 2025-09-30 07:18:13.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:13 compute-0 nova_compute[189265]: 2025-09-30 07:18:13.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.992 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74ffbf65-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.992 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.993 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74ffbf65-e0, col_values=(('external_ids', {'iface-id': '0c700e20-e593-4a77-93d7-fc919dc1f294'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.993 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:18:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:13.995 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[8e671d36-2888-4b33-9157-c4e783c6ca9f]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-74ffbf65-ebbd-4587-bf5b-0b38421a4813\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 74ffbf65-ebbd-4587-bf5b-0b38421a4813\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.255 2 DEBUG nova.virt.libvirt.vif [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:16:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1531701854',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1531701854',id=7,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:17:13Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1413b21c2db845e58d8a81f524a55f3a',ramdisk_id='',reservation_id='r-f20tme0f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-2061885601',owner_user_name='tempest-TestExecuteActionsViaActuator-2061885601-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:17:13Z,user_data=None,user_id='d6cb6be5d6fc407eb3abc1c7c70f5d77',uuid=992a0681-bc5e-40b3-adf3-305eee0718fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "address": "fa:16:3e:ec:ab:8a", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38c613b-6d", "ovs_interfaceid": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.256 2 DEBUG nova.network.os_vif_util [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converting VIF {"id": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "address": "fa:16:3e:ec:ab:8a", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38c613b-6d", "ovs_interfaceid": "a38c613b-6d8b-4dc3-96e4-4a4103b20d91", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.257 2 DEBUG nova.network.os_vif_util [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ec:ab:8a,bridge_name='br-int',has_traffic_filtering=True,id=a38c613b-6d8b-4dc3-96e4-4a4103b20d91,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38c613b-6d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.257 2 DEBUG os_vif [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:ab:8a,bridge_name='br-int',has_traffic_filtering=True,id=a38c613b-6d8b-4dc3-96e4-4a4103b20d91,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38c613b-6d') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.260 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa38c613b-6d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.265 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c313f9cc-bcca-4194-b48d-4009c6e3a9e5) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.269 2 INFO os_vif [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ec:ab:8a,bridge_name='br-int',has_traffic_filtering=True,id=a38c613b-6d8b-4dc3-96e4-4a4103b20d91,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38c613b-6d')
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.269 2 INFO nova.virt.libvirt.driver [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Deleting instance files /var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd_del
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.270 2 INFO nova.virt.libvirt.driver [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Deletion of /var/lib/nova/instances/992a0681-bc5e-40b3-adf3-305eee0718fd_del complete
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.313 2 DEBUG nova.compute.manager [req-e9978e83-438c-40f0-85b6-f5240c346244 req-c6d8fb34-ede6-4eeb-969b-7f1393e710df 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received event network-vif-unplugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.313 2 DEBUG oslo_concurrency.lockutils [req-e9978e83-438c-40f0-85b6-f5240c346244 req-c6d8fb34-ede6-4eeb-969b-7f1393e710df 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.314 2 DEBUG oslo_concurrency.lockutils [req-e9978e83-438c-40f0-85b6-f5240c346244 req-c6d8fb34-ede6-4eeb-969b-7f1393e710df 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.314 2 DEBUG oslo_concurrency.lockutils [req-e9978e83-438c-40f0-85b6-f5240c346244 req-c6d8fb34-ede6-4eeb-969b-7f1393e710df 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.315 2 DEBUG nova.compute.manager [req-e9978e83-438c-40f0-85b6-f5240c346244 req-c6d8fb34-ede6-4eeb-969b-7f1393e710df 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] No waiting events found dispatching network-vif-unplugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.315 2 DEBUG nova.compute.manager [req-e9978e83-438c-40f0-85b6-f5240c346244 req-c6d8fb34-ede6-4eeb-969b-7f1393e710df 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received event network-vif-unplugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.781 2 INFO nova.compute.manager [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Took 1.36 seconds to destroy the instance on the hypervisor.
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.781 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.782 2 DEBUG nova.compute.manager [-] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.782 2 DEBUG nova.network.neutron [-] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:18:14 compute-0 nova_compute[189265]: 2025-09-30 07:18:14.782 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:18:15 compute-0 nova_compute[189265]: 2025-09-30 07:18:15.220 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.061 2 DEBUG nova.network.neutron [-] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.528 2 DEBUG nova.compute.manager [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received event network-vif-unplugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.529 2 DEBUG oslo_concurrency.lockutils [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.529 2 DEBUG oslo_concurrency.lockutils [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.530 2 DEBUG oslo_concurrency.lockutils [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.530 2 DEBUG nova.compute.manager [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] No waiting events found dispatching network-vif-unplugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.531 2 DEBUG nova.compute.manager [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received event network-vif-unplugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.531 2 DEBUG nova.compute.manager [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received event network-vif-plugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.532 2 DEBUG oslo_concurrency.lockutils [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.533 2 DEBUG oslo_concurrency.lockutils [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.533 2 DEBUG oslo_concurrency.lockutils [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.534 2 DEBUG nova.compute.manager [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] No waiting events found dispatching network-vif-plugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.534 2 WARNING nova.compute.manager [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received unexpected event network-vif-plugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 for instance with vm_state active and task_state deleting.
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.535 2 DEBUG nova.compute.manager [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received event network-vif-plugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.535 2 DEBUG oslo_concurrency.lockutils [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.536 2 DEBUG oslo_concurrency.lockutils [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.536 2 DEBUG oslo_concurrency.lockutils [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.537 2 DEBUG nova.compute.manager [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] No waiting events found dispatching network-vif-plugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.537 2 WARNING nova.compute.manager [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received unexpected event network-vif-plugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 for instance with vm_state active and task_state deleting.
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.538 2 DEBUG nova.compute.manager [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received event network-vif-unplugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.538 2 DEBUG oslo_concurrency.lockutils [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.539 2 DEBUG oslo_concurrency.lockutils [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.539 2 DEBUG oslo_concurrency.lockutils [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.540 2 DEBUG nova.compute.manager [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] No waiting events found dispatching network-vif-unplugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.540 2 DEBUG nova.compute.manager [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received event network-vif-unplugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.541 2 DEBUG nova.compute.manager [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received event network-vif-unplugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.541 2 DEBUG oslo_concurrency.lockutils [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.542 2 DEBUG oslo_concurrency.lockutils [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.542 2 DEBUG oslo_concurrency.lockutils [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.542 2 DEBUG nova.compute.manager [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] No waiting events found dispatching network-vif-unplugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.543 2 DEBUG nova.compute.manager [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received event network-vif-unplugged-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.543 2 DEBUG nova.compute.manager [req-e74967df-de32-4982-a0a3-5eea91781769 req-e46fb298-8c82-45bf-8abb-cdd3c44aca61 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Received event network-vif-deleted-a38c613b-6d8b-4dc3-96e4-4a4103b20d91 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.570 2 INFO nova.compute.manager [-] [instance: 992a0681-bc5e-40b3-adf3-305eee0718fd] Took 1.79 seconds to deallocate network for instance.
Sep 30 07:18:16 compute-0 nova_compute[189265]: 2025-09-30 07:18:16.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:17 compute-0 nova_compute[189265]: 2025-09-30 07:18:17.240 2 DEBUG oslo_concurrency.lockutils [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:17 compute-0 nova_compute[189265]: 2025-09-30 07:18:17.241 2 DEBUG oslo_concurrency.lockutils [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:17 compute-0 nova_compute[189265]: 2025-09-30 07:18:17.385 2 DEBUG nova.compute.provider_tree [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:18:17 compute-0 podman[214143]: 2025-09-30 07:18:17.517232498 +0000 UTC m=+0.090023684 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-type=git, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, name=ubi9-minimal, release=1755695350, managed_by=edpm_ansible, io.openshift.expose-services=)
Sep 30 07:18:17 compute-0 nova_compute[189265]: 2025-09-30 07:18:17.901 2 DEBUG nova.scheduler.client.report [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:18:18 compute-0 nova_compute[189265]: 2025-09-30 07:18:18.451 2 DEBUG oslo_concurrency.lockutils [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.210s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:18 compute-0 nova_compute[189265]: 2025-09-30 07:18:18.570 2 INFO nova.scheduler.client.report [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Deleted allocations for instance 992a0681-bc5e-40b3-adf3-305eee0718fd
Sep 30 07:18:19 compute-0 nova_compute[189265]: 2025-09-30 07:18:19.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:19 compute-0 nova_compute[189265]: 2025-09-30 07:18:19.806 2 DEBUG oslo_concurrency.lockutils [None req-1699a42a-3ed3-4a34-ad59-ab7d3e5dbbb4 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "992a0681-bc5e-40b3-adf3-305eee0718fd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.926s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:20 compute-0 podman[214164]: 2025-09-30 07:18:20.498447393 +0000 UTC m=+0.075073298 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:18:20 compute-0 podman[214165]: 2025-09-30 07:18:20.51583283 +0000 UTC m=+0.097236204 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Sep 30 07:18:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:20.542 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:20.543 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:20.543 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:20 compute-0 nova_compute[189265]: 2025-09-30 07:18:20.913 2 DEBUG oslo_concurrency.lockutils [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:20 compute-0 nova_compute[189265]: 2025-09-30 07:18:20.914 2 DEBUG oslo_concurrency.lockutils [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:20 compute-0 nova_compute[189265]: 2025-09-30 07:18:20.914 2 DEBUG oslo_concurrency.lockutils [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:20 compute-0 nova_compute[189265]: 2025-09-30 07:18:20.914 2 DEBUG oslo_concurrency.lockutils [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:20 compute-0 nova_compute[189265]: 2025-09-30 07:18:20.914 2 DEBUG oslo_concurrency.lockutils [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:21 compute-0 nova_compute[189265]: 2025-09-30 07:18:21.004 2 INFO nova.compute.manager [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Terminating instance
Sep 30 07:18:21 compute-0 nova_compute[189265]: 2025-09-30 07:18:21.670 2 DEBUG nova.compute.manager [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:18:21 compute-0 kernel: tapd4e03551-f8 (unregistering): left promiscuous mode
Sep 30 07:18:21 compute-0 NetworkManager[51813]: <info>  [1759216701.7039] device (tapd4e03551-f8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:18:21 compute-0 nova_compute[189265]: 2025-09-30 07:18:21.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:21 compute-0 ovn_controller[91436]: 2025-09-30T07:18:21Z|00076|binding|INFO|Releasing lport d4e03551-f8cd-4604-990f-8c855bea77fa from this chassis (sb_readonly=0)
Sep 30 07:18:21 compute-0 ovn_controller[91436]: 2025-09-30T07:18:21Z|00077|binding|INFO|Setting lport d4e03551-f8cd-4604-990f-8c855bea77fa down in Southbound
Sep 30 07:18:21 compute-0 ovn_controller[91436]: 2025-09-30T07:18:21Z|00078|binding|INFO|Removing iface tapd4e03551-f8 ovn-installed in OVS
Sep 30 07:18:21 compute-0 nova_compute[189265]: 2025-09-30 07:18:21.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:21 compute-0 nova_compute[189265]: 2025-09-30 07:18:21.773 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:21 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000006.scope: Deactivated successfully.
Sep 30 07:18:21 compute-0 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d00000006.scope: Consumed 12.751s CPU time.
Sep 30 07:18:21 compute-0 systemd-machined[149233]: Machine qemu-5-instance-00000006 terminated.
Sep 30 07:18:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:21.822 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:35:b6 10.100.0.12'], port_security=['fa:16:3e:c9:35:b6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '3ad1a338-1146-48fa-a1fb-579e9b577b6c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1413b21c2db845e58d8a81f524a55f3a', 'neutron:revision_number': '10', 'neutron:security_group_ids': '8ad3c6f6-3842-4d69-92ac-cef07b75c3bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b541691-433c-426c-b8b7-10d79319603a, chassis=[], tunnel_key=7, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=d4e03551-f8cd-4604-990f-8c855bea77fa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:18:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:21.823 100322 INFO neutron.agent.ovn.metadata.agent [-] Port d4e03551-f8cd-4604-990f-8c855bea77fa in datapath 74ffbf65-ebbd-4587-bf5b-0b38421a4813 unbound from our chassis
Sep 30 07:18:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:21.825 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74ffbf65-ebbd-4587-bf5b-0b38421a4813
Sep 30 07:18:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:21.845 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0538fe21-3b86-43fd-bd76-f1d313962a97]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:21 compute-0 nova_compute[189265]: 2025-09-30 07:18:21.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:21 compute-0 podman[214212]: 2025-09-30 07:18:21.870895169 +0000 UTC m=+0.095917366 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:18:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:21.891 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[9469aed0-ea2d-4f91-a96f-8fab96014c81]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:21.894 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[8e08ae98-ca87-4aae-8bfc-4547472f8db6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:21 compute-0 nova_compute[189265]: 2025-09-30 07:18:21.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:21 compute-0 nova_compute[189265]: 2025-09-30 07:18:21.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:21.930 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[2f278a82-e6aa-4a8e-90ed-1ac1bfeb648a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:21 compute-0 nova_compute[189265]: 2025-09-30 07:18:21.938 2 INFO nova.virt.libvirt.driver [-] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Instance destroyed successfully.
Sep 30 07:18:21 compute-0 nova_compute[189265]: 2025-09-30 07:18:21.939 2 DEBUG nova.objects.instance [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lazy-loading 'resources' on Instance uuid 3ad1a338-1146-48fa-a1fb-579e9b577b6c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:18:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:21.950 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[8550815f-f247-4495-89fa-5ee73c8a13b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74ffbf65-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:ef:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 19, 'rx_bytes': 2008, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 19, 'rx_bytes': 2008, 'tx_bytes': 942, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434702, 'reachable_time': 35230, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214256, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:21.970 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f46d7a8d-5151-4a15-8ae9-8cb7b3cf6fa1]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434715, 'tstamp': 434715}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214257, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434718, 'tstamp': 434718}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214257, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:21.972 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74ffbf65-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:21 compute-0 nova_compute[189265]: 2025-09-30 07:18:21.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:21 compute-0 nova_compute[189265]: 2025-09-30 07:18:21.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:21.981 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74ffbf65-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:21.982 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:18:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:21.982 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74ffbf65-e0, col_values=(('external_ids', {'iface-id': '0c700e20-e593-4a77-93d7-fc919dc1f294'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:21.982 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:18:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:21.984 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[24c9ed5b-660f-4c4f-9979-0c0b6ca46491]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-74ffbf65-ebbd-4587-bf5b-0b38421a4813\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 74ffbf65-ebbd-4587-bf5b-0b38421a4813\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.396 2 DEBUG nova.compute.manager [req-28530ed7-4974-432b-847d-0a329a155066 req-473d2741-47a9-4d0a-bd69-c5dd346eb8fa 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Received event network-vif-unplugged-d4e03551-f8cd-4604-990f-8c855bea77fa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.397 2 DEBUG oslo_concurrency.lockutils [req-28530ed7-4974-432b-847d-0a329a155066 req-473d2741-47a9-4d0a-bd69-c5dd346eb8fa 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.398 2 DEBUG oslo_concurrency.lockutils [req-28530ed7-4974-432b-847d-0a329a155066 req-473d2741-47a9-4d0a-bd69-c5dd346eb8fa 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.399 2 DEBUG oslo_concurrency.lockutils [req-28530ed7-4974-432b-847d-0a329a155066 req-473d2741-47a9-4d0a-bd69-c5dd346eb8fa 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.399 2 DEBUG nova.compute.manager [req-28530ed7-4974-432b-847d-0a329a155066 req-473d2741-47a9-4d0a-bd69-c5dd346eb8fa 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] No waiting events found dispatching network-vif-unplugged-d4e03551-f8cd-4604-990f-8c855bea77fa pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.400 2 DEBUG nova.compute.manager [req-28530ed7-4974-432b-847d-0a329a155066 req-473d2741-47a9-4d0a-bd69-c5dd346eb8fa 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Received event network-vif-unplugged-d4e03551-f8cd-4604-990f-8c855bea77fa for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.477 2 DEBUG nova.virt.libvirt.vif [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:16:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1188797546',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1188797546',id=6,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:17:56Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1413b21c2db845e58d8a81f524a55f3a',ramdisk_id='',reservation_id='r-yngwu08y',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-2061885601',owner_user_name='tempest-TestExecuteActionsViaActuator-2061885601-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:18:08Z,user_data=None,user_id='d6cb6be5d6fc407eb3abc1c7c70f5d77',uuid=3ad1a338-1146-48fa-a1fb-579e9b577b6c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4e03551-f8cd-4604-990f-8c855bea77fa", "address": "fa:16:3e:c9:35:b6", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4e03551-f8", "ovs_interfaceid": "d4e03551-f8cd-4604-990f-8c855bea77fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.477 2 DEBUG nova.network.os_vif_util [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converting VIF {"id": "d4e03551-f8cd-4604-990f-8c855bea77fa", "address": "fa:16:3e:c9:35:b6", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4e03551-f8", "ovs_interfaceid": "d4e03551-f8cd-4604-990f-8c855bea77fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.478 2 DEBUG nova.network.os_vif_util [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c9:35:b6,bridge_name='br-int',has_traffic_filtering=True,id=d4e03551-f8cd-4604-990f-8c855bea77fa,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4e03551-f8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.478 2 DEBUG os_vif [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:35:b6,bridge_name='br-int',has_traffic_filtering=True,id=d4e03551-f8cd-4604-990f-8c855bea77fa,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4e03551-f8') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.480 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4e03551-f8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.484 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=a509e0ae-a3ab-4024-8f26-5034ec138084) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.488 2 INFO os_vif [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c9:35:b6,bridge_name='br-int',has_traffic_filtering=True,id=d4e03551-f8cd-4604-990f-8c855bea77fa,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4e03551-f8')
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.488 2 INFO nova.virt.libvirt.driver [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Deleting instance files /var/lib/nova/instances/3ad1a338-1146-48fa-a1fb-579e9b577b6c_del
Sep 30 07:18:22 compute-0 nova_compute[189265]: 2025-09-30 07:18:22.493 2 INFO nova.virt.libvirt.driver [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Deletion of /var/lib/nova/instances/3ad1a338-1146-48fa-a1fb-579e9b577b6c_del complete
Sep 30 07:18:23 compute-0 nova_compute[189265]: 2025-09-30 07:18:23.055 2 INFO nova.compute.manager [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Took 1.38 seconds to destroy the instance on the hypervisor.
Sep 30 07:18:23 compute-0 nova_compute[189265]: 2025-09-30 07:18:23.056 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:18:23 compute-0 nova_compute[189265]: 2025-09-30 07:18:23.056 2 DEBUG nova.compute.manager [-] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:18:23 compute-0 nova_compute[189265]: 2025-09-30 07:18:23.056 2 DEBUG nova.network.neutron [-] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:18:23 compute-0 nova_compute[189265]: 2025-09-30 07:18:23.057 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:18:23 compute-0 nova_compute[189265]: 2025-09-30 07:18:23.208 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:18:24 compute-0 nova_compute[189265]: 2025-09-30 07:18:24.320 2 DEBUG nova.network.neutron [-] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:18:24 compute-0 nova_compute[189265]: 2025-09-30 07:18:24.493 2 DEBUG nova.compute.manager [req-1699d856-7459-478b-bf8b-bfe6f8f48fed req-2502e634-5647-4ebf-97ff-5c8e721a55f0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Received event network-vif-unplugged-d4e03551-f8cd-4604-990f-8c855bea77fa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:24 compute-0 nova_compute[189265]: 2025-09-30 07:18:24.493 2 DEBUG oslo_concurrency.lockutils [req-1699d856-7459-478b-bf8b-bfe6f8f48fed req-2502e634-5647-4ebf-97ff-5c8e721a55f0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:24 compute-0 nova_compute[189265]: 2025-09-30 07:18:24.493 2 DEBUG oslo_concurrency.lockutils [req-1699d856-7459-478b-bf8b-bfe6f8f48fed req-2502e634-5647-4ebf-97ff-5c8e721a55f0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:24 compute-0 nova_compute[189265]: 2025-09-30 07:18:24.494 2 DEBUG oslo_concurrency.lockutils [req-1699d856-7459-478b-bf8b-bfe6f8f48fed req-2502e634-5647-4ebf-97ff-5c8e721a55f0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:24 compute-0 nova_compute[189265]: 2025-09-30 07:18:24.494 2 DEBUG nova.compute.manager [req-1699d856-7459-478b-bf8b-bfe6f8f48fed req-2502e634-5647-4ebf-97ff-5c8e721a55f0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] No waiting events found dispatching network-vif-unplugged-d4e03551-f8cd-4604-990f-8c855bea77fa pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:18:24 compute-0 nova_compute[189265]: 2025-09-30 07:18:24.494 2 DEBUG nova.compute.manager [req-1699d856-7459-478b-bf8b-bfe6f8f48fed req-2502e634-5647-4ebf-97ff-5c8e721a55f0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Received event network-vif-unplugged-d4e03551-f8cd-4604-990f-8c855bea77fa for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:18:24 compute-0 nova_compute[189265]: 2025-09-30 07:18:24.494 2 DEBUG nova.compute.manager [req-1699d856-7459-478b-bf8b-bfe6f8f48fed req-2502e634-5647-4ebf-97ff-5c8e721a55f0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Received event network-vif-deleted-d4e03551-f8cd-4604-990f-8c855bea77fa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:24 compute-0 nova_compute[189265]: 2025-09-30 07:18:24.835 2 INFO nova.compute.manager [-] [instance: 3ad1a338-1146-48fa-a1fb-579e9b577b6c] Took 1.78 seconds to deallocate network for instance.
Sep 30 07:18:25 compute-0 nova_compute[189265]: 2025-09-30 07:18:25.353 2 DEBUG oslo_concurrency.lockutils [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:25 compute-0 nova_compute[189265]: 2025-09-30 07:18:25.354 2 DEBUG oslo_concurrency.lockutils [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:25 compute-0 nova_compute[189265]: 2025-09-30 07:18:25.359 2 DEBUG oslo_concurrency.lockutils [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:25 compute-0 nova_compute[189265]: 2025-09-30 07:18:25.387 2 INFO nova.scheduler.client.report [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Deleted allocations for instance 3ad1a338-1146-48fa-a1fb-579e9b577b6c
Sep 30 07:18:26 compute-0 nova_compute[189265]: 2025-09-30 07:18:26.504 2 DEBUG oslo_concurrency.lockutils [None req-61482242-1574-4dec-9530-5ce0ea61fe3f d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "3ad1a338-1146-48fa-a1fb-579e9b577b6c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.590s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:26 compute-0 nova_compute[189265]: 2025-09-30 07:18:26.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:27 compute-0 nova_compute[189265]: 2025-09-30 07:18:27.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:28 compute-0 nova_compute[189265]: 2025-09-30 07:18:28.457 2 DEBUG oslo_concurrency.lockutils [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:28 compute-0 nova_compute[189265]: 2025-09-30 07:18:28.458 2 DEBUG oslo_concurrency.lockutils [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:28 compute-0 nova_compute[189265]: 2025-09-30 07:18:28.458 2 DEBUG oslo_concurrency.lockutils [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:28 compute-0 nova_compute[189265]: 2025-09-30 07:18:28.459 2 DEBUG oslo_concurrency.lockutils [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:28 compute-0 nova_compute[189265]: 2025-09-30 07:18:28.459 2 DEBUG oslo_concurrency.lockutils [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:28 compute-0 nova_compute[189265]: 2025-09-30 07:18:28.487 2 INFO nova.compute.manager [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Terminating instance
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.032 2 DEBUG nova.compute.manager [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:18:29 compute-0 kernel: tapdd1a8613-e6 (unregistering): left promiscuous mode
Sep 30 07:18:29 compute-0 NetworkManager[51813]: <info>  [1759216709.0699] device (tapdd1a8613-e6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:18:29 compute-0 ovn_controller[91436]: 2025-09-30T07:18:29Z|00079|binding|INFO|Releasing lport dd1a8613-e62a-44c6-9960-a46776a2c059 from this chassis (sb_readonly=0)
Sep 30 07:18:29 compute-0 ovn_controller[91436]: 2025-09-30T07:18:29Z|00080|binding|INFO|Setting lport dd1a8613-e62a-44c6-9960-a46776a2c059 down in Southbound
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:29 compute-0 ovn_controller[91436]: 2025-09-30T07:18:29Z|00081|binding|INFO|Removing iface tapdd1a8613-e6 ovn-installed in OVS
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:29.109 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ac:12:53 10.100.0.6'], port_security=['fa:16:3e:ac:12:53 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'd40a0fba-a20e-4dcf-a048-10d9e21c6cf6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1413b21c2db845e58d8a81f524a55f3a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8ad3c6f6-3842-4d69-92ac-cef07b75c3bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b541691-433c-426c-b8b7-10d79319603a, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=dd1a8613-e62a-44c6-9960-a46776a2c059) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:18:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:29.112 100322 INFO neutron.agent.ovn.metadata.agent [-] Port dd1a8613-e62a-44c6-9960-a46776a2c059 in datapath 74ffbf65-ebbd-4587-bf5b-0b38421a4813 unbound from our chassis
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:29.117 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74ffbf65-ebbd-4587-bf5b-0b38421a4813
Sep 30 07:18:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:29.140 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7b9b2ee2-f242-4dfc-be9d-42d4b46f244a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:29 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Deactivated successfully.
Sep 30 07:18:29 compute-0 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000005.scope: Consumed 17.433s CPU time.
Sep 30 07:18:29 compute-0 systemd-machined[149233]: Machine qemu-2-instance-00000005 terminated.
Sep 30 07:18:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:29.192 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[dfde89fe-49e5-43de-949a-5697e8e1fb4f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:29.195 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[7ab28680-da9b-4346-ac39-f000e1104e6b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:29.241 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[e753d207-e3e8-4eb2-b2cb-da390e3662bb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:29.271 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b778a5b7-9597-4ff0-a06c-a29ed10f3d10]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74ffbf65-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:ef:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 21, 'rx_bytes': 2008, 'tx_bytes': 1026, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 21, 'rx_bytes': 2008, 'tx_bytes': 1026, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434702, 'reachable_time': 35230, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214272, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:29.299 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[2cc7ab88-637c-4efa-9ef7-d35cafc8bb79]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434715, 'tstamp': 434715}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214283, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434718, 'tstamp': 434718}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214283, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:29.300 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74ffbf65-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:29.308 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74ffbf65-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:29.308 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:18:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:29.309 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74ffbf65-e0, col_values=(('external_ids', {'iface-id': '0c700e20-e593-4a77-93d7-fc919dc1f294'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:29.309 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:18:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:29.310 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7a66e23d-cede-4511-b839-53f2f5d8e62a]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-74ffbf65-ebbd-4587-bf5b-0b38421a4813\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 74ffbf65-ebbd-4587-bf5b-0b38421a4813\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.323 2 INFO nova.virt.libvirt.driver [-] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Instance destroyed successfully.
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.323 2 DEBUG nova.objects.instance [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lazy-loading 'resources' on Instance uuid d40a0fba-a20e-4dcf-a048-10d9e21c6cf6 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.356 2 DEBUG nova.compute.manager [req-842bfefe-67d5-4f4e-b8fe-a15c2800bdb0 req-0b52ce0e-7b79-40d9-b370-c27a599e4176 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Received event network-vif-unplugged-dd1a8613-e62a-44c6-9960-a46776a2c059 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.356 2 DEBUG oslo_concurrency.lockutils [req-842bfefe-67d5-4f4e-b8fe-a15c2800bdb0 req-0b52ce0e-7b79-40d9-b370-c27a599e4176 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.356 2 DEBUG oslo_concurrency.lockutils [req-842bfefe-67d5-4f4e-b8fe-a15c2800bdb0 req-0b52ce0e-7b79-40d9-b370-c27a599e4176 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.356 2 DEBUG oslo_concurrency.lockutils [req-842bfefe-67d5-4f4e-b8fe-a15c2800bdb0 req-0b52ce0e-7b79-40d9-b370-c27a599e4176 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.357 2 DEBUG nova.compute.manager [req-842bfefe-67d5-4f4e-b8fe-a15c2800bdb0 req-0b52ce0e-7b79-40d9-b370-c27a599e4176 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] No waiting events found dispatching network-vif-unplugged-dd1a8613-e62a-44c6-9960-a46776a2c059 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.357 2 DEBUG nova.compute.manager [req-842bfefe-67d5-4f4e-b8fe-a15c2800bdb0 req-0b52ce0e-7b79-40d9-b370-c27a599e4176 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Received event network-vif-unplugged-dd1a8613-e62a-44c6-9960-a46776a2c059 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:18:29 compute-0 podman[199733]: time="2025-09-30T07:18:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:18:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:18:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:18:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:18:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3465 "" "Go-http-client/1.1"
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.851 2 DEBUG nova.virt.libvirt.vif [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:16:04Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1824333052',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1824333052',id=5,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:16:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1413b21c2db845e58d8a81f524a55f3a',ramdisk_id='',reservation_id='r-o0ztmckl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-2061885601',owner_user_name='tempest-TestExecuteActionsViaActuator-2061885601-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:16:24Z,user_data=None,user_id='d6cb6be5d6fc407eb3abc1c7c70f5d77',uuid=d40a0fba-a20e-4dcf-a048-10d9e21c6cf6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dd1a8613-e62a-44c6-9960-a46776a2c059", "address": "fa:16:3e:ac:12:53", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd1a8613-e6", "ovs_interfaceid": "dd1a8613-e62a-44c6-9960-a46776a2c059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.852 2 DEBUG nova.network.os_vif_util [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converting VIF {"id": "dd1a8613-e62a-44c6-9960-a46776a2c059", "address": "fa:16:3e:ac:12:53", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd1a8613-e6", "ovs_interfaceid": "dd1a8613-e62a-44c6-9960-a46776a2c059", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.853 2 DEBUG nova.network.os_vif_util [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ac:12:53,bridge_name='br-int',has_traffic_filtering=True,id=dd1a8613-e62a-44c6-9960-a46776a2c059,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd1a8613-e6') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.853 2 DEBUG os_vif [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:12:53,bridge_name='br-int',has_traffic_filtering=True,id=dd1a8613-e62a-44c6-9960-a46776a2c059,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd1a8613-e6') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.855 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd1a8613-e6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.858 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=621e1e7e-3789-4f4a-a676-70cbb85a0ddc) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.861 2 INFO os_vif [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ac:12:53,bridge_name='br-int',has_traffic_filtering=True,id=dd1a8613-e62a-44c6-9960-a46776a2c059,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd1a8613-e6')
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.861 2 INFO nova.virt.libvirt.driver [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Deleting instance files /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6_del
Sep 30 07:18:29 compute-0 nova_compute[189265]: 2025-09-30 07:18:29.862 2 INFO nova.virt.libvirt.driver [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Deletion of /var/lib/nova/instances/d40a0fba-a20e-4dcf-a048-10d9e21c6cf6_del complete
Sep 30 07:18:30 compute-0 nova_compute[189265]: 2025-09-30 07:18:30.381 2 INFO nova.compute.manager [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Took 1.35 seconds to destroy the instance on the hypervisor.
Sep 30 07:18:30 compute-0 nova_compute[189265]: 2025-09-30 07:18:30.382 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:18:30 compute-0 nova_compute[189265]: 2025-09-30 07:18:30.382 2 DEBUG nova.compute.manager [-] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:18:30 compute-0 nova_compute[189265]: 2025-09-30 07:18:30.382 2 DEBUG nova.network.neutron [-] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:18:30 compute-0 nova_compute[189265]: 2025-09-30 07:18:30.383 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:18:31 compute-0 nova_compute[189265]: 2025-09-30 07:18:31.243 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:18:31 compute-0 nova_compute[189265]: 2025-09-30 07:18:31.414 2 DEBUG nova.compute.manager [req-68abaf2d-28a8-450c-9c9b-9babc27a8533 req-f7f6e4ee-5a42-45be-bfbd-1749c7a68d00 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Received event network-vif-unplugged-dd1a8613-e62a-44c6-9960-a46776a2c059 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:31 compute-0 nova_compute[189265]: 2025-09-30 07:18:31.415 2 DEBUG oslo_concurrency.lockutils [req-68abaf2d-28a8-450c-9c9b-9babc27a8533 req-f7f6e4ee-5a42-45be-bfbd-1749c7a68d00 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:31 compute-0 nova_compute[189265]: 2025-09-30 07:18:31.415 2 DEBUG oslo_concurrency.lockutils [req-68abaf2d-28a8-450c-9c9b-9babc27a8533 req-f7f6e4ee-5a42-45be-bfbd-1749c7a68d00 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:31 compute-0 nova_compute[189265]: 2025-09-30 07:18:31.415 2 DEBUG oslo_concurrency.lockutils [req-68abaf2d-28a8-450c-9c9b-9babc27a8533 req-f7f6e4ee-5a42-45be-bfbd-1749c7a68d00 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:31 compute-0 nova_compute[189265]: 2025-09-30 07:18:31.415 2 DEBUG nova.compute.manager [req-68abaf2d-28a8-450c-9c9b-9babc27a8533 req-f7f6e4ee-5a42-45be-bfbd-1749c7a68d00 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] No waiting events found dispatching network-vif-unplugged-dd1a8613-e62a-44c6-9960-a46776a2c059 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:18:31 compute-0 nova_compute[189265]: 2025-09-30 07:18:31.415 2 DEBUG nova.compute.manager [req-68abaf2d-28a8-450c-9c9b-9babc27a8533 req-f7f6e4ee-5a42-45be-bfbd-1749c7a68d00 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Received event network-vif-unplugged-dd1a8613-e62a-44c6-9960-a46776a2c059 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:18:31 compute-0 openstack_network_exporter[201859]: ERROR   07:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:18:31 compute-0 openstack_network_exporter[201859]: ERROR   07:18:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:18:31 compute-0 openstack_network_exporter[201859]: ERROR   07:18:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:18:31 compute-0 openstack_network_exporter[201859]: ERROR   07:18:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:18:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:18:31 compute-0 openstack_network_exporter[201859]: ERROR   07:18:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:18:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:18:31 compute-0 nova_compute[189265]: 2025-09-30 07:18:31.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:18:31 compute-0 nova_compute[189265]: 2025-09-30 07:18:31.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:32 compute-0 nova_compute[189265]: 2025-09-30 07:18:32.776 2 DEBUG nova.network.neutron [-] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:18:32 compute-0 nova_compute[189265]: 2025-09-30 07:18:32.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:18:33 compute-0 nova_compute[189265]: 2025-09-30 07:18:33.284 2 INFO nova.compute.manager [-] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Took 2.90 seconds to deallocate network for instance.
Sep 30 07:18:33 compute-0 nova_compute[189265]: 2025-09-30 07:18:33.529 2 DEBUG nova.compute.manager [req-9d2a2b01-41c9-49a9-9b38-3f7bb425df59 req-8fcfd279-d1d5-425e-afd7-96e32d70962a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d40a0fba-a20e-4dcf-a048-10d9e21c6cf6] Received event network-vif-deleted-dd1a8613-e62a-44c6-9960-a46776a2c059 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:33 compute-0 nova_compute[189265]: 2025-09-30 07:18:33.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:18:33 compute-0 nova_compute[189265]: 2025-09-30 07:18:33.881 2 DEBUG oslo_concurrency.lockutils [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:33 compute-0 nova_compute[189265]: 2025-09-30 07:18:33.882 2 DEBUG oslo_concurrency.lockutils [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:33 compute-0 nova_compute[189265]: 2025-09-30 07:18:33.958 2 DEBUG nova.compute.provider_tree [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:18:34 compute-0 nova_compute[189265]: 2025-09-30 07:18:34.469 2 DEBUG nova.scheduler.client.report [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:18:34 compute-0 nova_compute[189265]: 2025-09-30 07:18:34.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:34 compute-0 nova_compute[189265]: 2025-09-30 07:18:34.980 2 DEBUG oslo_concurrency.lockutils [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:35 compute-0 nova_compute[189265]: 2025-09-30 07:18:35.060 2 INFO nova.scheduler.client.report [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Deleted allocations for instance d40a0fba-a20e-4dcf-a048-10d9e21c6cf6
Sep 30 07:18:35 compute-0 podman[214291]: 2025-09-30 07:18:35.514740999 +0000 UTC m=+0.081881886 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:18:35 compute-0 nova_compute[189265]: 2025-09-30 07:18:35.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:18:35 compute-0 nova_compute[189265]: 2025-09-30 07:18:35.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:18:35 compute-0 nova_compute[189265]: 2025-09-30 07:18:35.789 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:18:35 compute-0 nova_compute[189265]: 2025-09-30 07:18:35.790 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:18:36 compute-0 nova_compute[189265]: 2025-09-30 07:18:36.187 2 DEBUG oslo_concurrency.lockutils [None req-cca4e902-ab4b-40ad-be35-53237ab01538 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "d40a0fba-a20e-4dcf-a048-10d9e21c6cf6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.730s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:36 compute-0 nova_compute[189265]: 2025-09-30 07:18:36.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:37 compute-0 nova_compute[189265]: 2025-09-30 07:18:37.977 2 DEBUG oslo_concurrency.lockutils [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "a62dd947-c757-461c-9dd7-2ccd8c8daf8c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:37 compute-0 nova_compute[189265]: 2025-09-30 07:18:37.977 2 DEBUG oslo_concurrency.lockutils [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "a62dd947-c757-461c-9dd7-2ccd8c8daf8c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:37 compute-0 nova_compute[189265]: 2025-09-30 07:18:37.978 2 DEBUG oslo_concurrency.lockutils [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "a62dd947-c757-461c-9dd7-2ccd8c8daf8c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:37 compute-0 nova_compute[189265]: 2025-09-30 07:18:37.978 2 DEBUG oslo_concurrency.lockutils [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "a62dd947-c757-461c-9dd7-2ccd8c8daf8c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:37 compute-0 nova_compute[189265]: 2025-09-30 07:18:37.978 2 DEBUG oslo_concurrency.lockutils [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "a62dd947-c757-461c-9dd7-2ccd8c8daf8c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:37 compute-0 nova_compute[189265]: 2025-09-30 07:18:37.988 2 INFO nova.compute.manager [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Terminating instance
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.294 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.295 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.504 2 DEBUG nova.compute.manager [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:18:38 compute-0 kernel: tap50e9f0fc-d5 (unregistering): left promiscuous mode
Sep 30 07:18:38 compute-0 NetworkManager[51813]: <info>  [1759216718.5305] device (tap50e9f0fc-d5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:18:38 compute-0 ovn_controller[91436]: 2025-09-30T07:18:38Z|00082|binding|INFO|Releasing lport 50e9f0fc-d5c3-4230-aea5-ef47736ac58f from this chassis (sb_readonly=0)
Sep 30 07:18:38 compute-0 ovn_controller[91436]: 2025-09-30T07:18:38Z|00083|binding|INFO|Setting lport 50e9f0fc-d5c3-4230-aea5-ef47736ac58f down in Southbound
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:38 compute-0 ovn_controller[91436]: 2025-09-30T07:18:38Z|00084|binding|INFO|Removing iface tap50e9f0fc-d5 ovn-installed in OVS
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:38.548 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:55:ec 10.100.0.13'], port_security=['fa:16:3e:d0:55:ec 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a62dd947-c757-461c-9dd7-2ccd8c8daf8c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1413b21c2db845e58d8a81f524a55f3a', 'neutron:revision_number': '14', 'neutron:security_group_ids': '8ad3c6f6-3842-4d69-92ac-cef07b75c3bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b541691-433c-426c-b8b7-10d79319603a, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=50e9f0fc-d5c3-4230-aea5-ef47736ac58f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:18:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:38.549 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 50e9f0fc-d5c3-4230-aea5-ef47736ac58f in datapath 74ffbf65-ebbd-4587-bf5b-0b38421a4813 unbound from our chassis
Sep 30 07:18:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:38.552 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 74ffbf65-ebbd-4587-bf5b-0b38421a4813
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:38.571 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0afe8c8e-9272-4ac5-b400-068ee167023a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:38 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Deactivated successfully.
Sep 30 07:18:38 compute-0 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000004.scope: Consumed 4.353s CPU time.
Sep 30 07:18:38 compute-0 systemd-machined[149233]: Machine qemu-4-instance-00000004 terminated.
Sep 30 07:18:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:38.609 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[866b3bef-678c-4b70-a50d-714880dce718]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:38.612 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[a71e0550-f534-4285-9557-b3137ea0ce9a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:38.645 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[79dc50eb-5e28-4858-99b1-6787eaad27ed]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:38.662 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[624d5bd1-8d18-478e-be01-d904dd3772e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap74ffbf65-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:ef:af'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 36, 'tx_packets': 23, 'rx_bytes': 2008, 'tx_bytes': 1110, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 36, 'tx_packets': 23, 'rx_bytes': 2008, 'tx_bytes': 1110, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434702, 'reachable_time': 35230, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214327, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:38.678 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e8d43d-921c-4a56-9b04-1714a699b7f4]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434715, 'tstamp': 434715}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214328, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap74ffbf65-e1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 434718, 'tstamp': 434718}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214328, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:38.678 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74ffbf65-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:38.685 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap74ffbf65-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:38.685 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:18:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:38.685 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap74ffbf65-e0, col_values=(('external_ids', {'iface-id': '0c700e20-e593-4a77-93d7-fc919dc1f294'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:38.685 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:18:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:38.686 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[ccfc172b-8fcf-4631-92d1-adce10861afd]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-74ffbf65-ebbd-4587-bf5b-0b38421a4813\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 74ffbf65-ebbd-4587-bf5b-0b38421a4813\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.729 2 DEBUG nova.compute.manager [req-d11b8a99-3f89-4c85-b187-ea408e1be845 req-9b248251-df1d-433a-8208-9e2abd7364bf 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Received event network-vif-unplugged-50e9f0fc-d5c3-4230-aea5-ef47736ac58f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.730 2 DEBUG oslo_concurrency.lockutils [req-d11b8a99-3f89-4c85-b187-ea408e1be845 req-9b248251-df1d-433a-8208-9e2abd7364bf 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "a62dd947-c757-461c-9dd7-2ccd8c8daf8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.730 2 DEBUG oslo_concurrency.lockutils [req-d11b8a99-3f89-4c85-b187-ea408e1be845 req-9b248251-df1d-433a-8208-9e2abd7364bf 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "a62dd947-c757-461c-9dd7-2ccd8c8daf8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.730 2 DEBUG oslo_concurrency.lockutils [req-d11b8a99-3f89-4c85-b187-ea408e1be845 req-9b248251-df1d-433a-8208-9e2abd7364bf 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "a62dd947-c757-461c-9dd7-2ccd8c8daf8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.731 2 DEBUG nova.compute.manager [req-d11b8a99-3f89-4c85-b187-ea408e1be845 req-9b248251-df1d-433a-8208-9e2abd7364bf 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] No waiting events found dispatching network-vif-unplugged-50e9f0fc-d5c3-4230-aea5-ef47736ac58f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.731 2 DEBUG nova.compute.manager [req-d11b8a99-3f89-4c85-b187-ea408e1be845 req-9b248251-df1d-433a-8208-9e2abd7364bf 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Received event network-vif-unplugged-50e9f0fc-d5c3-4230-aea5-ef47736ac58f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.761 2 INFO nova.virt.libvirt.driver [-] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Instance destroyed successfully.
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.762 2 DEBUG nova.objects.instance [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lazy-loading 'resources' on Instance uuid a62dd947-c757-461c-9dd7-2ccd8c8daf8c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.807 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.807 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.808 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:38 compute-0 nova_compute[189265]: 2025-09-30 07:18:38.808 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.268 2 DEBUG nova.virt.libvirt.vif [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T07:15:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-653610675',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-653610675',id=4,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:16:00Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1413b21c2db845e58d8a81f524a55f3a',ramdisk_id='',reservation_id='r-rahezju2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',clean_attempts='1',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',i
mage_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-2061885601',owner_user_name='tempest-TestExecuteActionsViaActuator-2061885601-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:17:58Z,user_data=None,user_id='d6cb6be5d6fc407eb3abc1c7c70f5d77',uuid=a62dd947-c757-461c-9dd7-2ccd8c8daf8c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50e9f0fc-d5c3-4230-aea5-ef47736ac58f", "address": "fa:16:3e:d0:55:ec", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50e9f0fc-d5", "ovs_interfaceid": "50e9f0fc-d5c3-4230-aea5-ef47736ac58f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.269 2 DEBUG nova.network.os_vif_util [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converting VIF {"id": "50e9f0fc-d5c3-4230-aea5-ef47736ac58f", "address": "fa:16:3e:d0:55:ec", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50e9f0fc-d5", "ovs_interfaceid": "50e9f0fc-d5c3-4230-aea5-ef47736ac58f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.270 2 DEBUG nova.network.os_vif_util [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d0:55:ec,bridge_name='br-int',has_traffic_filtering=True,id=50e9f0fc-d5c3-4230-aea5-ef47736ac58f,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50e9f0fc-d5') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.270 2 DEBUG os_vif [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d0:55:ec,bridge_name='br-int',has_traffic_filtering=True,id=50e9f0fc-d5c3-4230-aea5-ef47736ac58f,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50e9f0fc-d5') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.273 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50e9f0fc-d5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.278 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=879cef01-d914-4ea3-8bb8-74fe164e6086) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.281 2 INFO os_vif [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d0:55:ec,bridge_name='br-int',has_traffic_filtering=True,id=50e9f0fc-d5c3-4230-aea5-ef47736ac58f,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50e9f0fc-d5')
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.282 2 INFO nova.virt.libvirt.driver [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Deleting instance files /var/lib/nova/instances/a62dd947-c757-461c-9dd7-2ccd8c8daf8c_del
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.282 2 INFO nova.virt.libvirt.driver [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Deletion of /var/lib/nova/instances/a62dd947-c757-461c-9dd7-2ccd8c8daf8c_del complete
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.793 2 INFO nova.compute.manager [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Took 1.29 seconds to destroy the instance on the hypervisor.
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.793 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.793 2 DEBUG nova.compute.manager [-] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.794 2 DEBUG nova.network.neutron [-] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.794 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.841 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Error from libvirt while getting description of instance-00000004: [Error Code 42] Domain not found: no domain with matching uuid 'a62dd947-c757-461c-9dd7-2ccd8c8daf8c' (instance-00000004): libvirt.libvirtError: Domain not found: no domain with matching uuid 'a62dd947-c757-461c-9dd7-2ccd8c8daf8c' (instance-00000004)
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.846 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.915 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.917 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:18:39 compute-0 nova_compute[189265]: 2025-09-30 07:18:39.969 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.112 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.114 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.133 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.019s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.134 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5701MB free_disk=73.25019836425781GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.134 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.135 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.142 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.804 2 DEBUG nova.compute.manager [req-e8b4126d-0f30-450a-820c-ebb3163e2d94 req-5596f5f9-bd26-436c-acbb-ba6ce3cc9a98 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Received event network-vif-unplugged-50e9f0fc-d5c3-4230-aea5-ef47736ac58f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.805 2 DEBUG oslo_concurrency.lockutils [req-e8b4126d-0f30-450a-820c-ebb3163e2d94 req-5596f5f9-bd26-436c-acbb-ba6ce3cc9a98 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "a62dd947-c757-461c-9dd7-2ccd8c8daf8c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.806 2 DEBUG oslo_concurrency.lockutils [req-e8b4126d-0f30-450a-820c-ebb3163e2d94 req-5596f5f9-bd26-436c-acbb-ba6ce3cc9a98 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "a62dd947-c757-461c-9dd7-2ccd8c8daf8c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.806 2 DEBUG oslo_concurrency.lockutils [req-e8b4126d-0f30-450a-820c-ebb3163e2d94 req-5596f5f9-bd26-436c-acbb-ba6ce3cc9a98 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "a62dd947-c757-461c-9dd7-2ccd8c8daf8c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.806 2 DEBUG nova.compute.manager [req-e8b4126d-0f30-450a-820c-ebb3163e2d94 req-5596f5f9-bd26-436c-acbb-ba6ce3cc9a98 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] No waiting events found dispatching network-vif-unplugged-50e9f0fc-d5c3-4230-aea5-ef47736ac58f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.807 2 DEBUG nova.compute.manager [req-e8b4126d-0f30-450a-820c-ebb3163e2d94 req-5596f5f9-bd26-436c-acbb-ba6ce3cc9a98 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Received event network-vif-unplugged-50e9f0fc-d5c3-4230-aea5-ef47736ac58f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.807 2 DEBUG nova.compute.manager [req-e8b4126d-0f30-450a-820c-ebb3163e2d94 req-5596f5f9-bd26-436c-acbb-ba6ce3cc9a98 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Received event network-vif-deleted-50e9f0fc-d5c3-4230-aea5-ef47736ac58f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.808 2 INFO nova.compute.manager [req-e8b4126d-0f30-450a-820c-ebb3163e2d94 req-5596f5f9-bd26-436c-acbb-ba6ce3cc9a98 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Neutron deleted interface 50e9f0fc-d5c3-4230-aea5-ef47736ac58f; detaching it from the instance and deleting it from the info cache
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.808 2 DEBUG nova.network.neutron [req-e8b4126d-0f30-450a-820c-ebb3163e2d94 req-5596f5f9-bd26-436c-acbb-ba6ce3cc9a98 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:18:40 compute-0 nova_compute[189265]: 2025-09-30 07:18:40.961 2 DEBUG nova.network.neutron [-] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:18:41 compute-0 nova_compute[189265]: 2025-09-30 07:18:41.215 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance 9fa193fb-a398-4552-85b4-a346dffcf697 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:18:41 compute-0 nova_compute[189265]: 2025-09-30 07:18:41.216 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance a62dd947-c757-461c-9dd7-2ccd8c8daf8c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:18:41 compute-0 nova_compute[189265]: 2025-09-30 07:18:41.216 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:18:41 compute-0 nova_compute[189265]: 2025-09-30 07:18:41.216 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:18:40 up  1:16,  0 user,  load average: 0.73, 0.48, 0.49\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_1413b21c2db845e58d8a81f524a55f3a': '2', 'io_workload': '0', 'num_task_deleting': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:18:41 compute-0 nova_compute[189265]: 2025-09-30 07:18:41.293 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:18:41 compute-0 nova_compute[189265]: 2025-09-30 07:18:41.318 2 DEBUG nova.compute.manager [req-e8b4126d-0f30-450a-820c-ebb3163e2d94 req-5596f5f9-bd26-436c-acbb-ba6ce3cc9a98 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Detach interface failed, port_id=50e9f0fc-d5c3-4230-aea5-ef47736ac58f, reason: Instance a62dd947-c757-461c-9dd7-2ccd8c8daf8c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 07:18:41 compute-0 nova_compute[189265]: 2025-09-30 07:18:41.474 2 INFO nova.compute.manager [-] [instance: a62dd947-c757-461c-9dd7-2ccd8c8daf8c] Took 1.68 seconds to deallocate network for instance.
Sep 30 07:18:41 compute-0 nova_compute[189265]: 2025-09-30 07:18:41.811 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:18:41 compute-0 nova_compute[189265]: 2025-09-30 07:18:41.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:42 compute-0 nova_compute[189265]: 2025-09-30 07:18:42.005 2 DEBUG oslo_concurrency.lockutils [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:42 compute-0 nova_compute[189265]: 2025-09-30 07:18:42.360 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:18:42 compute-0 nova_compute[189265]: 2025-09-30 07:18:42.361 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.226s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:42 compute-0 nova_compute[189265]: 2025-09-30 07:18:42.361 2 DEBUG oslo_concurrency.lockutils [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.356s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:42 compute-0 nova_compute[189265]: 2025-09-30 07:18:42.362 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:18:42 compute-0 nova_compute[189265]: 2025-09-30 07:18:42.362 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 07:18:42 compute-0 nova_compute[189265]: 2025-09-30 07:18:42.429 2 DEBUG nova.compute.provider_tree [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:18:42 compute-0 nova_compute[189265]: 2025-09-30 07:18:42.873 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 07:18:42 compute-0 nova_compute[189265]: 2025-09-30 07:18:42.945 2 DEBUG nova.scheduler.client.report [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:18:43 compute-0 nova_compute[189265]: 2025-09-30 07:18:43.514 2 DEBUG oslo_concurrency.lockutils [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.153s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:43 compute-0 nova_compute[189265]: 2025-09-30 07:18:43.534 2 INFO nova.scheduler.client.report [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Deleted allocations for instance a62dd947-c757-461c-9dd7-2ccd8c8daf8c
Sep 30 07:18:44 compute-0 nova_compute[189265]: 2025-09-30 07:18:44.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:44 compute-0 podman[214355]: 2025-09-30 07:18:44.517009229 +0000 UTC m=+0.097363128 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20250930)
Sep 30 07:18:44 compute-0 nova_compute[189265]: 2025-09-30 07:18:44.574 2 DEBUG oslo_concurrency.lockutils [None req-ae0ab4e4-5a88-48fe-b264-f9aa76886ff5 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "a62dd947-c757-461c-9dd7-2ccd8c8daf8c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.596s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:45 compute-0 nova_compute[189265]: 2025-09-30 07:18:45.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:18:45 compute-0 nova_compute[189265]: 2025-09-30 07:18:45.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:18:45 compute-0 nova_compute[189265]: 2025-09-30 07:18:45.789 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 07:18:46 compute-0 nova_compute[189265]: 2025-09-30 07:18:46.151 2 DEBUG oslo_concurrency.lockutils [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "9fa193fb-a398-4552-85b4-a346dffcf697" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:46 compute-0 nova_compute[189265]: 2025-09-30 07:18:46.151 2 DEBUG oslo_concurrency.lockutils [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "9fa193fb-a398-4552-85b4-a346dffcf697" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:46 compute-0 nova_compute[189265]: 2025-09-30 07:18:46.152 2 DEBUG oslo_concurrency.lockutils [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:46 compute-0 nova_compute[189265]: 2025-09-30 07:18:46.152 2 DEBUG oslo_concurrency.lockutils [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:46 compute-0 nova_compute[189265]: 2025-09-30 07:18:46.152 2 DEBUG oslo_concurrency.lockutils [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:46 compute-0 nova_compute[189265]: 2025-09-30 07:18:46.175 2 INFO nova.compute.manager [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Terminating instance
Sep 30 07:18:46 compute-0 nova_compute[189265]: 2025-09-30 07:18:46.699 2 DEBUG nova.compute.manager [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:18:46 compute-0 kernel: tap5e18274a-8c (unregistering): left promiscuous mode
Sep 30 07:18:46 compute-0 NetworkManager[51813]: <info>  [1759216726.7280] device (tap5e18274a-8c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:18:46 compute-0 nova_compute[189265]: 2025-09-30 07:18:46.772 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:46 compute-0 ovn_controller[91436]: 2025-09-30T07:18:46Z|00085|binding|INFO|Releasing lport 5e18274a-8ca2-4391-88b8-e5a90d72fc7c from this chassis (sb_readonly=0)
Sep 30 07:18:46 compute-0 ovn_controller[91436]: 2025-09-30T07:18:46Z|00086|binding|INFO|Setting lport 5e18274a-8ca2-4391-88b8-e5a90d72fc7c down in Southbound
Sep 30 07:18:46 compute-0 ovn_controller[91436]: 2025-09-30T07:18:46Z|00087|binding|INFO|Removing iface tap5e18274a-8c ovn-installed in OVS
Sep 30 07:18:46 compute-0 nova_compute[189265]: 2025-09-30 07:18:46.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:46 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:46.779 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0d:a8:de 10.100.0.4'], port_security=['fa:16:3e:0d:a8:de 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '9fa193fb-a398-4552-85b4-a346dffcf697', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1413b21c2db845e58d8a81f524a55f3a', 'neutron:revision_number': '5', 'neutron:security_group_ids': '8ad3c6f6-3842-4d69-92ac-cef07b75c3bc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7b541691-433c-426c-b8b7-10d79319603a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=5e18274a-8ca2-4391-88b8-e5a90d72fc7c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:18:46 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:46.780 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 5e18274a-8ca2-4391-88b8-e5a90d72fc7c in datapath 74ffbf65-ebbd-4587-bf5b-0b38421a4813 unbound from our chassis
Sep 30 07:18:46 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:46.782 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 74ffbf65-ebbd-4587-bf5b-0b38421a4813, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:18:46 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:46.783 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[34cb9dd3-8e30-460b-9d26-1534847dd80d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:46 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:46.784 100322 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813 namespace which is not needed anymore
Sep 30 07:18:46 compute-0 nova_compute[189265]: 2025-09-30 07:18:46.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:46 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Sep 30 07:18:46 compute-0 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 22.540s CPU time.
Sep 30 07:18:46 compute-0 systemd-machined[149233]: Machine qemu-1-instance-00000003 terminated.
Sep 30 07:18:46 compute-0 nova_compute[189265]: 2025-09-30 07:18:46.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:46 compute-0 nova_compute[189265]: 2025-09-30 07:18:46.976 2 INFO nova.virt.libvirt.driver [-] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Instance destroyed successfully.
Sep 30 07:18:46 compute-0 nova_compute[189265]: 2025-09-30 07:18:46.976 2 DEBUG nova.objects.instance [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lazy-loading 'resources' on Instance uuid 9fa193fb-a398-4552-85b4-a346dffcf697 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:18:46 compute-0 neutron-haproxy-ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813[212820]: [NOTICE]   (212824) : haproxy version is 3.0.5-8e879a5
Sep 30 07:18:46 compute-0 neutron-haproxy-ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813[212820]: [NOTICE]   (212824) : path to executable is /usr/sbin/haproxy
Sep 30 07:18:46 compute-0 neutron-haproxy-ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813[212820]: [WARNING]  (212824) : Exiting Master process...
Sep 30 07:18:46 compute-0 podman[214399]: 2025-09-30 07:18:46.985481645 +0000 UTC m=+0.055774235 container kill 4f8fa3b9e27071334d0f82812cc3f92f254f7c3e1c81ac2da6da7bed7c85f659 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 07:18:46 compute-0 neutron-haproxy-ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813[212820]: [ALERT]    (212824) : Current worker (212826) exited with code 143 (Terminated)
Sep 30 07:18:46 compute-0 neutron-haproxy-ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813[212820]: [WARNING]  (212824) : All workers exited. Exiting... (0)
Sep 30 07:18:46 compute-0 systemd[1]: libpod-4f8fa3b9e27071334d0f82812cc3f92f254f7c3e1c81ac2da6da7bed7c85f659.scope: Deactivated successfully.
Sep 30 07:18:47 compute-0 podman[214430]: 2025-09-30 07:18:47.045526055 +0000 UTC m=+0.036068172 container died 4f8fa3b9e27071334d0f82812cc3f92f254f7c3e1c81ac2da6da7bed7c85f659 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 07:18:47 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4f8fa3b9e27071334d0f82812cc3f92f254f7c3e1c81ac2da6da7bed7c85f659-userdata-shm.mount: Deactivated successfully.
Sep 30 07:18:47 compute-0 systemd[1]: var-lib-containers-storage-overlay-de4e4deace1333bc25798c2cea434bd6e9718bac418016a5573694616caedb8d-merged.mount: Deactivated successfully.
Sep 30 07:18:47 compute-0 podman[214430]: 2025-09-30 07:18:47.087948131 +0000 UTC m=+0.078490218 container cleanup 4f8fa3b9e27071334d0f82812cc3f92f254f7c3e1c81ac2da6da7bed7c85f659 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 07:18:47 compute-0 systemd[1]: libpod-conmon-4f8fa3b9e27071334d0f82812cc3f92f254f7c3e1c81ac2da6da7bed7c85f659.scope: Deactivated successfully.
Sep 30 07:18:47 compute-0 podman[214432]: 2025-09-30 07:18:47.109174119 +0000 UTC m=+0.090023054 container remove 4f8fa3b9e27071334d0f82812cc3f92f254f7c3e1c81ac2da6da7bed7c85f659 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:18:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:47.115 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[a10ae15f-0bdb-4cf1-8fd5-af1e8fe296a4]: (4, ("Tue Sep 30 07:18:46 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813 (4f8fa3b9e27071334d0f82812cc3f92f254f7c3e1c81ac2da6da7bed7c85f659)\n4f8fa3b9e27071334d0f82812cc3f92f254f7c3e1c81ac2da6da7bed7c85f659\nTue Sep 30 07:18:47 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813 (4f8fa3b9e27071334d0f82812cc3f92f254f7c3e1c81ac2da6da7bed7c85f659)\n4f8fa3b9e27071334d0f82812cc3f92f254f7c3e1c81ac2da6da7bed7c85f659\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:47.116 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f843566b-f748-40f9-a4bf-fde760e6efb3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:47.116 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/74ffbf65-ebbd-4587-bf5b-0b38421a4813.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:18:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:47.117 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[482d2c44-2c22-44ee-8d3f-a4c27a33ec6b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:47.118 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap74ffbf65-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:47 compute-0 kernel: tap74ffbf65-e0: left promiscuous mode
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:47.140 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f44437a6-f7b4-4b03-b00f-41e88d05abfa]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:47.168 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[53cb902f-3f93-442e-be3d-f3884e04f3f2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:47.169 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[fc2cc104-5d59-40a0-b5ad-ed1ff7834998]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:47.184 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[9d4b4b34-bbd1-4107-9442-1bebb6fead4a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 434695, 'reachable_time': 41364, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214464, 'error': None, 'target': 'ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:47 compute-0 systemd[1]: run-netns-ovnmeta\x2d74ffbf65\x2debbd\x2d4587\x2dbf5b\x2d0b38421a4813.mount: Deactivated successfully.
Sep 30 07:18:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:47.188 100440 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-74ffbf65-ebbd-4587-bf5b-0b38421a4813 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 07:18:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:47.191 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[23069d79-32a9-4245-b2c9-f2fdf4c4e25d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.315 2 DEBUG nova.compute.manager [req-8c3276e3-16c1-4e53-86e0-3593149bcf26 req-751b65a0-7d75-46c0-89b9-107553284ca3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Received event network-vif-unplugged-5e18274a-8ca2-4391-88b8-e5a90d72fc7c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.316 2 DEBUG oslo_concurrency.lockutils [req-8c3276e3-16c1-4e53-86e0-3593149bcf26 req-751b65a0-7d75-46c0-89b9-107553284ca3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.316 2 DEBUG oslo_concurrency.lockutils [req-8c3276e3-16c1-4e53-86e0-3593149bcf26 req-751b65a0-7d75-46c0-89b9-107553284ca3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.317 2 DEBUG oslo_concurrency.lockutils [req-8c3276e3-16c1-4e53-86e0-3593149bcf26 req-751b65a0-7d75-46c0-89b9-107553284ca3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.317 2 DEBUG nova.compute.manager [req-8c3276e3-16c1-4e53-86e0-3593149bcf26 req-751b65a0-7d75-46c0-89b9-107553284ca3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] No waiting events found dispatching network-vif-unplugged-5e18274a-8ca2-4391-88b8-e5a90d72fc7c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.318 2 DEBUG nova.compute.manager [req-8c3276e3-16c1-4e53-86e0-3593149bcf26 req-751b65a0-7d75-46c0-89b9-107553284ca3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Received event network-vif-unplugged-5e18274a-8ca2-4391-88b8-e5a90d72fc7c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:18:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:47.395 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:47.397 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.483 2 DEBUG nova.virt.libvirt.vif [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:14:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteActionsViaActuator-server-1895073706',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteactionsviaactuator-server-1895073706',id=3,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:14:49Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1413b21c2db845e58d8a81f524a55f3a',ramdisk_id='',reservation_id='r-njnn49ef',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteActionsViaActuator-2061885601',owner_user_name='tempest-TestExecuteActionsViaActuator-2061885601-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:14:49Z,user_data=None,user_id='d6cb6be5d6fc407eb3abc1c7c70f5d77',uuid=9fa193fb-a398-4552-85b4-a346dffcf697,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "address": "fa:16:3e:0d:a8:de", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e18274a-8c", "ovs_interfaceid": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.484 2 DEBUG nova.network.os_vif_util [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converting VIF {"id": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "address": "fa:16:3e:0d:a8:de", "network": {"id": "74ffbf65-ebbd-4587-bf5b-0b38421a4813", "bridge": "br-int", "label": "tempest-TestExecuteActionsViaActuator-1315246804-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b1dc2a906d2242f79ffab81c2cf3c4d7", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5e18274a-8c", "ovs_interfaceid": "5e18274a-8ca2-4391-88b8-e5a90d72fc7c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.485 2 DEBUG nova.network.os_vif_util [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0d:a8:de,bridge_name='br-int',has_traffic_filtering=True,id=5e18274a-8ca2-4391-88b8-e5a90d72fc7c,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e18274a-8c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.486 2 DEBUG os_vif [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:a8:de,bridge_name='br-int',has_traffic_filtering=True,id=5e18274a-8ca2-4391-88b8-e5a90d72fc7c,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e18274a-8c') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.489 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5e18274a-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.495 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=09c38c36-3856-4f0b-a8c2-35037b1cfe51) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.501 2 INFO os_vif [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0d:a8:de,bridge_name='br-int',has_traffic_filtering=True,id=5e18274a-8ca2-4391-88b8-e5a90d72fc7c,network=Network(74ffbf65-ebbd-4587-bf5b-0b38421a4813),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5e18274a-8c')
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.502 2 INFO nova.virt.libvirt.driver [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Deleting instance files /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697_del
Sep 30 07:18:47 compute-0 nova_compute[189265]: 2025-09-30 07:18:47.503 2 INFO nova.virt.libvirt.driver [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Deletion of /var/lib/nova/instances/9fa193fb-a398-4552-85b4-a346dffcf697_del complete
Sep 30 07:18:48 compute-0 nova_compute[189265]: 2025-09-30 07:18:48.017 2 INFO nova.compute.manager [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Took 1.32 seconds to destroy the instance on the hypervisor.
Sep 30 07:18:48 compute-0 nova_compute[189265]: 2025-09-30 07:18:48.017 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:18:48 compute-0 nova_compute[189265]: 2025-09-30 07:18:48.018 2 DEBUG nova.compute.manager [-] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:18:48 compute-0 nova_compute[189265]: 2025-09-30 07:18:48.018 2 DEBUG nova.network.neutron [-] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:18:48 compute-0 nova_compute[189265]: 2025-09-30 07:18:48.019 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:18:48 compute-0 nova_compute[189265]: 2025-09-30 07:18:48.237 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:18:48 compute-0 podman[214467]: 2025-09-30 07:18:48.463991559 +0000 UTC m=+0.049964047 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, config_id=edpm, architecture=x86_64, distribution-scope=public)
Sep 30 07:18:49 compute-0 nova_compute[189265]: 2025-09-30 07:18:49.160 2 DEBUG nova.network.neutron [-] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:18:49 compute-0 nova_compute[189265]: 2025-09-30 07:18:49.402 2 DEBUG nova.compute.manager [req-d9f948b4-1f8e-4eaa-a7f8-42cb98428fef req-2cf12508-d18f-470f-a11e-e2ce64c48b92 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Received event network-vif-unplugged-5e18274a-8ca2-4391-88b8-e5a90d72fc7c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:49 compute-0 nova_compute[189265]: 2025-09-30 07:18:49.403 2 DEBUG oslo_concurrency.lockutils [req-d9f948b4-1f8e-4eaa-a7f8-42cb98428fef req-2cf12508-d18f-470f-a11e-e2ce64c48b92 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:49 compute-0 nova_compute[189265]: 2025-09-30 07:18:49.404 2 DEBUG oslo_concurrency.lockutils [req-d9f948b4-1f8e-4eaa-a7f8-42cb98428fef req-2cf12508-d18f-470f-a11e-e2ce64c48b92 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:49 compute-0 nova_compute[189265]: 2025-09-30 07:18:49.404 2 DEBUG oslo_concurrency.lockutils [req-d9f948b4-1f8e-4eaa-a7f8-42cb98428fef req-2cf12508-d18f-470f-a11e-e2ce64c48b92 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "9fa193fb-a398-4552-85b4-a346dffcf697-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:49 compute-0 nova_compute[189265]: 2025-09-30 07:18:49.404 2 DEBUG nova.compute.manager [req-d9f948b4-1f8e-4eaa-a7f8-42cb98428fef req-2cf12508-d18f-470f-a11e-e2ce64c48b92 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] No waiting events found dispatching network-vif-unplugged-5e18274a-8ca2-4391-88b8-e5a90d72fc7c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:18:49 compute-0 nova_compute[189265]: 2025-09-30 07:18:49.405 2 DEBUG nova.compute.manager [req-d9f948b4-1f8e-4eaa-a7f8-42cb98428fef req-2cf12508-d18f-470f-a11e-e2ce64c48b92 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Received event network-vif-unplugged-5e18274a-8ca2-4391-88b8-e5a90d72fc7c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:18:49 compute-0 nova_compute[189265]: 2025-09-30 07:18:49.405 2 DEBUG nova.compute.manager [req-d9f948b4-1f8e-4eaa-a7f8-42cb98428fef req-2cf12508-d18f-470f-a11e-e2ce64c48b92 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Received event network-vif-deleted-5e18274a-8ca2-4391-88b8-e5a90d72fc7c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:18:49 compute-0 nova_compute[189265]: 2025-09-30 07:18:49.668 2 INFO nova.compute.manager [-] [instance: 9fa193fb-a398-4552-85b4-a346dffcf697] Took 1.65 seconds to deallocate network for instance.
Sep 30 07:18:50 compute-0 nova_compute[189265]: 2025-09-30 07:18:50.216 2 DEBUG oslo_concurrency.lockutils [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:18:50 compute-0 nova_compute[189265]: 2025-09-30 07:18:50.217 2 DEBUG oslo_concurrency.lockutils [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:18:50 compute-0 nova_compute[189265]: 2025-09-30 07:18:50.270 2 DEBUG nova.compute.provider_tree [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:18:50 compute-0 nova_compute[189265]: 2025-09-30 07:18:50.777 2 DEBUG nova.scheduler.client.report [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:18:51 compute-0 nova_compute[189265]: 2025-09-30 07:18:51.293 2 DEBUG oslo_concurrency.lockutils [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.076s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:51 compute-0 nova_compute[189265]: 2025-09-30 07:18:51.325 2 INFO nova.scheduler.client.report [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Deleted allocations for instance 9fa193fb-a398-4552-85b4-a346dffcf697
Sep 30 07:18:51 compute-0 podman[214488]: 2025-09-30 07:18:51.489321529 +0000 UTC m=+0.060179815 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd)
Sep 30 07:18:51 compute-0 podman[214489]: 2025-09-30 07:18:51.537219654 +0000 UTC m=+0.112734236 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:18:51 compute-0 nova_compute[189265]: 2025-09-30 07:18:51.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:52 compute-0 nova_compute[189265]: 2025-09-30 07:18:52.359 2 DEBUG oslo_concurrency.lockutils [None req-6b9463a3-bd85-47ba-b062-c26653132cf0 d6cb6be5d6fc407eb3abc1c7c70f5d77 1413b21c2db845e58d8a81f524a55f3a - - default default] Lock "9fa193fb-a398-4552-85b4-a346dffcf697" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.208s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:18:52 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:18:52.399 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:18:52 compute-0 podman[214535]: 2025-09-30 07:18:52.47548392 +0000 UTC m=+0.056474287 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:18:52 compute-0 nova_compute[189265]: 2025-09-30 07:18:52.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:56 compute-0 nova_compute[189265]: 2025-09-30 07:18:56.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:57 compute-0 nova_compute[189265]: 2025-09-30 07:18:57.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:18:59 compute-0 podman[199733]: time="2025-09-30T07:18:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:18:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:18:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:18:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:18:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3006 "" "Go-http-client/1.1"
Sep 30 07:19:01 compute-0 openstack_network_exporter[201859]: ERROR   07:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:19:01 compute-0 openstack_network_exporter[201859]: ERROR   07:19:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:19:01 compute-0 openstack_network_exporter[201859]: ERROR   07:19:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:19:01 compute-0 openstack_network_exporter[201859]: ERROR   07:19:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:19:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:19:01 compute-0 openstack_network_exporter[201859]: ERROR   07:19:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:19:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:19:01 compute-0 nova_compute[189265]: 2025-09-30 07:19:01.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:02 compute-0 nova_compute[189265]: 2025-09-30 07:19:02.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:05 compute-0 nova_compute[189265]: 2025-09-30 07:19:05.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:06 compute-0 podman[214555]: 2025-09-30 07:19:06.480894029 +0000 UTC m=+0.064967393 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:19:06 compute-0 nova_compute[189265]: 2025-09-30 07:19:06.919 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:07 compute-0 nova_compute[189265]: 2025-09-30 07:19:07.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:11 compute-0 nova_compute[189265]: 2025-09-30 07:19:11.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:12 compute-0 nova_compute[189265]: 2025-09-30 07:19:12.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:15 compute-0 podman[214579]: 2025-09-30 07:19:15.484302894 +0000 UTC m=+0.063373787 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Sep 30 07:19:16 compute-0 nova_compute[189265]: 2025-09-30 07:19:16.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:17 compute-0 nova_compute[189265]: 2025-09-30 07:19:17.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:19 compute-0 podman[214600]: 2025-09-30 07:19:19.50767397 +0000 UTC m=+0.082575547 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Sep 30 07:19:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:19:20.544 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:19:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:19:20.544 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:19:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:19:20.544 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:19:21 compute-0 nova_compute[189265]: 2025-09-30 07:19:21.926 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:19:22.476 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3d:e8:11 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8bd4c178-e5a2-4919-a1df-9c84df6c5788', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd196370b58a64910bc1103fb42505b15', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0a8dcf8-2d25-4318-a682-128f51c53fdc, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f8042a54-712a-444f-8425-c77f4087b994) old=Port_Binding(mac=['fa:16:3e:3d:e8:11'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8bd4c178-e5a2-4919-a1df-9c84df6c5788', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd196370b58a64910bc1103fb42505b15', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:19:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:19:22.477 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f8042a54-712a-444f-8425-c77f4087b994 in datapath 8bd4c178-e5a2-4919-a1df-9c84df6c5788 updated
Sep 30 07:19:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:19:22.477 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8bd4c178-e5a2-4919-a1df-9c84df6c5788, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:19:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:19:22.478 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[834b14b4-0edf-4a88-b91c-1b27ee664036]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:19:22 compute-0 podman[214622]: 2025-09-30 07:19:22.501889462 +0000 UTC m=+0.083480983 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Sep 30 07:19:22 compute-0 nova_compute[189265]: 2025-09-30 07:19:22.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:22 compute-0 podman[214632]: 2025-09-30 07:19:22.6123577 +0000 UTC m=+0.138684061 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Sep 30 07:19:22 compute-0 podman[214649]: 2025-09-30 07:19:22.642614942 +0000 UTC m=+0.105370261 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:19:26 compute-0 nova_compute[189265]: 2025-09-30 07:19:26.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:27 compute-0 nova_compute[189265]: 2025-09-30 07:19:27.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:29 compute-0 podman[199733]: time="2025-09-30T07:19:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:19:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:19:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:19:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:19:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3004 "" "Go-http-client/1.1"
Sep 30 07:19:31 compute-0 openstack_network_exporter[201859]: ERROR   07:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:19:31 compute-0 openstack_network_exporter[201859]: ERROR   07:19:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:19:31 compute-0 openstack_network_exporter[201859]: ERROR   07:19:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:19:31 compute-0 openstack_network_exporter[201859]: ERROR   07:19:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:19:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:19:31 compute-0 openstack_network_exporter[201859]: ERROR   07:19:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:19:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:19:31 compute-0 nova_compute[189265]: 2025-09-30 07:19:31.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:32 compute-0 nova_compute[189265]: 2025-09-30 07:19:32.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:33 compute-0 nova_compute[189265]: 2025-09-30 07:19:33.290 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:19:33 compute-0 nova_compute[189265]: 2025-09-30 07:19:33.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:19:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:19:34.290 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:9b:1c 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d9c8b47b-8756-4b06-9746-9e2f99fda03f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9c8b47b-8756-4b06-9746-9e2f99fda03f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5914ab8585ff4a26a783d58aae38b75d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4f0d2b1d-10e6-42bc-9a53-c7d127d5568b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=071b9187-472b-44f4-ba84-48bb14d2fa14) old=Port_Binding(mac=['fa:16:3e:47:9b:1c'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-d9c8b47b-8756-4b06-9746-9e2f99fda03f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d9c8b47b-8756-4b06-9746-9e2f99fda03f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5914ab8585ff4a26a783d58aae38b75d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:19:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:19:34.291 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 071b9187-472b-44f4-ba84-48bb14d2fa14 in datapath d9c8b47b-8756-4b06-9746-9e2f99fda03f updated
Sep 30 07:19:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:19:34.293 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d9c8b47b-8756-4b06-9746-9e2f99fda03f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:19:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:19:34.294 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[90868d9c-c37d-4fe5-801c-03642b1cd1de]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:19:35 compute-0 nova_compute[189265]: 2025-09-30 07:19:35.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:19:35 compute-0 nova_compute[189265]: 2025-09-30 07:19:35.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:19:35 compute-0 nova_compute[189265]: 2025-09-30 07:19:35.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:19:36 compute-0 nova_compute[189265]: 2025-09-30 07:19:36.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:19:36 compute-0 nova_compute[189265]: 2025-09-30 07:19:36.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:37 compute-0 podman[214681]: 2025-09-30 07:19:37.476253326 +0000 UTC m=+0.051985199 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:19:37 compute-0 nova_compute[189265]: 2025-09-30 07:19:37.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:37 compute-0 nova_compute[189265]: 2025-09-30 07:19:37.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:19:38 compute-0 nova_compute[189265]: 2025-09-30 07:19:38.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:19:39 compute-0 nova_compute[189265]: 2025-09-30 07:19:39.309 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:19:39 compute-0 nova_compute[189265]: 2025-09-30 07:19:39.309 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:19:39 compute-0 nova_compute[189265]: 2025-09-30 07:19:39.310 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:19:39 compute-0 nova_compute[189265]: 2025-09-30 07:19:39.310 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:19:39 compute-0 nova_compute[189265]: 2025-09-30 07:19:39.434 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:19:39 compute-0 nova_compute[189265]: 2025-09-30 07:19:39.435 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:19:39 compute-0 nova_compute[189265]: 2025-09-30 07:19:39.449 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.014s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:19:39 compute-0 nova_compute[189265]: 2025-09-30 07:19:39.449 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5869MB free_disk=73.30793380737305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:19:39 compute-0 nova_compute[189265]: 2025-09-30 07:19:39.450 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:19:39 compute-0 nova_compute[189265]: 2025-09-30 07:19:39.450 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:19:40 compute-0 nova_compute[189265]: 2025-09-30 07:19:40.694 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:19:40 compute-0 nova_compute[189265]: 2025-09-30 07:19:40.695 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:19:39 up  1:17,  0 user,  load average: 0.27, 0.39, 0.45\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:19:40 compute-0 nova_compute[189265]: 2025-09-30 07:19:40.738 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing inventories for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 07:19:40 compute-0 nova_compute[189265]: 2025-09-30 07:19:40.792 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating ProviderTree inventory for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 07:19:40 compute-0 nova_compute[189265]: 2025-09-30 07:19:40.793 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating inventory in ProviderTree for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 07:19:40 compute-0 nova_compute[189265]: 2025-09-30 07:19:40.807 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing aggregate associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 07:19:40 compute-0 nova_compute[189265]: 2025-09-30 07:19:40.826 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing trait associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, traits: COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_FMA3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_AC97,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_ABM,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,HW_CPU_X86_CLMUL _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 07:19:40 compute-0 nova_compute[189265]: 2025-09-30 07:19:40.845 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:19:41 compute-0 nova_compute[189265]: 2025-09-30 07:19:41.379 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:19:41 compute-0 nova_compute[189265]: 2025-09-30 07:19:41.934 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:19:41 compute-0 nova_compute[189265]: 2025-09-30 07:19:41.934 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.484s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:19:41 compute-0 nova_compute[189265]: 2025-09-30 07:19:41.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:42 compute-0 nova_compute[189265]: 2025-09-30 07:19:42.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:42 compute-0 ovn_controller[91436]: 2025-09-30T07:19:42Z|00088|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Sep 30 07:19:42 compute-0 nova_compute[189265]: 2025-09-30 07:19:42.934 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:19:44 compute-0 nova_compute[189265]: 2025-09-30 07:19:44.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:19:46 compute-0 podman[214708]: 2025-09-30 07:19:46.472247535 +0000 UTC m=+0.054157054 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 07:19:46 compute-0 nova_compute[189265]: 2025-09-30 07:19:46.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:47 compute-0 nova_compute[189265]: 2025-09-30 07:19:47.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:47 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 07:19:50 compute-0 podman[214731]: 2025-09-30 07:19:50.481200856 +0000 UTC m=+0.063959180 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.33.7, version=9.6, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, architecture=x86_64, name=ubi9-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 07:19:51 compute-0 nova_compute[189265]: 2025-09-30 07:19:51.939 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:52 compute-0 nova_compute[189265]: 2025-09-30 07:19:52.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:53 compute-0 podman[214753]: 2025-09-30 07:19:53.503361229 +0000 UTC m=+0.083034397 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:19:53 compute-0 podman[214754]: 2025-09-30 07:19:53.503330088 +0000 UTC m=+0.088098205 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 07:19:53 compute-0 podman[214755]: 2025-09-30 07:19:53.537305301 +0000 UTC m=+0.114177057 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 07:19:56 compute-0 nova_compute[189265]: 2025-09-30 07:19:56.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:57 compute-0 nova_compute[189265]: 2025-09-30 07:19:57.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:19:59 compute-0 podman[199733]: time="2025-09-30T07:19:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:19:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:19:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:19:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:19:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3004 "" "Go-http-client/1.1"
Sep 30 07:20:01 compute-0 anacron[168842]: Job `cron.daily' started
Sep 30 07:20:01 compute-0 anacron[168842]: Job `cron.daily' terminated
Sep 30 07:20:01 compute-0 openstack_network_exporter[201859]: ERROR   07:20:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:20:01 compute-0 openstack_network_exporter[201859]: ERROR   07:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:20:01 compute-0 openstack_network_exporter[201859]: ERROR   07:20:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:20:01 compute-0 openstack_network_exporter[201859]: ERROR   07:20:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:20:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:20:01 compute-0 openstack_network_exporter[201859]: ERROR   07:20:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:20:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:20:01 compute-0 nova_compute[189265]: 2025-09-30 07:20:01.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:02 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:02.061 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:20:02 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:02.062 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:20:02 compute-0 nova_compute[189265]: 2025-09-30 07:20:02.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:02 compute-0 nova_compute[189265]: 2025-09-30 07:20:02.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:06 compute-0 nova_compute[189265]: 2025-09-30 07:20:06.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:07 compute-0 nova_compute[189265]: 2025-09-30 07:20:07.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:08 compute-0 podman[214818]: 2025-09-30 07:20:08.469311678 +0000 UTC m=+0.051890357 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:20:10 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:10.064 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:20:10 compute-0 nova_compute[189265]: 2025-09-30 07:20:10.871 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Acquiring lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:20:10 compute-0 nova_compute[189265]: 2025-09-30 07:20:10.872 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:20:11 compute-0 nova_compute[189265]: 2025-09-30 07:20:11.379 2 DEBUG nova.compute.manager [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 07:20:11 compute-0 nova_compute[189265]: 2025-09-30 07:20:11.942 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:20:11 compute-0 nova_compute[189265]: 2025-09-30 07:20:11.943 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:20:11 compute-0 nova_compute[189265]: 2025-09-30 07:20:11.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:11 compute-0 nova_compute[189265]: 2025-09-30 07:20:11.951 2 DEBUG nova.virt.hardware [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 07:20:11 compute-0 nova_compute[189265]: 2025-09-30 07:20:11.952 2 INFO nova.compute.claims [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Claim successful on node compute-0.ctlplane.example.com
Sep 30 07:20:12 compute-0 nova_compute[189265]: 2025-09-30 07:20:12.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:13 compute-0 nova_compute[189265]: 2025-09-30 07:20:13.021 2 DEBUG nova.compute.provider_tree [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:20:13 compute-0 nova_compute[189265]: 2025-09-30 07:20:13.539 2 DEBUG nova.scheduler.client.report [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:20:14 compute-0 nova_compute[189265]: 2025-09-30 07:20:14.048 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.106s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:20:14 compute-0 nova_compute[189265]: 2025-09-30 07:20:14.049 2 DEBUG nova.compute.manager [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 07:20:14 compute-0 nova_compute[189265]: 2025-09-30 07:20:14.567 2 DEBUG nova.compute.manager [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 07:20:14 compute-0 nova_compute[189265]: 2025-09-30 07:20:14.568 2 DEBUG nova.network.neutron [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 07:20:14 compute-0 nova_compute[189265]: 2025-09-30 07:20:14.569 2 WARNING neutronclient.v2_0.client [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:20:14 compute-0 nova_compute[189265]: 2025-09-30 07:20:14.569 2 WARNING neutronclient.v2_0.client [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:20:15 compute-0 nova_compute[189265]: 2025-09-30 07:20:15.081 2 INFO nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 07:20:15 compute-0 nova_compute[189265]: 2025-09-30 07:20:15.591 2 DEBUG nova.compute.manager [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.459 2 DEBUG nova.network.neutron [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Successfully created port: 1624cd02-73d5-4555-b8de-b38f00887c31 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.610 2 DEBUG nova.compute.manager [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.612 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.613 2 INFO nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Creating image(s)
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.614 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Acquiring lock "/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.614 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Lock "/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.615 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Lock "/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.616 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.623 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.625 2 DEBUG oslo_concurrency.processutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.705 2 DEBUG oslo_concurrency.processutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.707 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.708 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.709 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.715 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.716 2 DEBUG oslo_concurrency.processutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.801 2 DEBUG oslo_concurrency.processutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.802 2 DEBUG oslo_concurrency.processutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.837 2 DEBUG oslo_concurrency.processutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.838 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.839 2 DEBUG oslo_concurrency.processutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.900 2 DEBUG oslo_concurrency.processutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.901 2 DEBUG nova.virt.disk.api [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Checking if we can resize image /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.901 2 DEBUG oslo_concurrency.processutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.990 2 DEBUG oslo_concurrency.processutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.991 2 DEBUG nova.virt.disk.api [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Cannot resize image /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.991 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.992 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Ensure instance console log exists: /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.993 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.993 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:20:16 compute-0 nova_compute[189265]: 2025-09-30 07:20:16.994 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:20:17 compute-0 nova_compute[189265]: 2025-09-30 07:20:17.282 2 DEBUG nova.network.neutron [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Successfully updated port: 1624cd02-73d5-4555-b8de-b38f00887c31 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 07:20:17 compute-0 nova_compute[189265]: 2025-09-30 07:20:17.366 2 DEBUG nova.compute.manager [req-cf6a2198-3726-4929-8bc2-d27afcbb84fa req-23c8db96-28fc-42a0-927e-625da31a130b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-changed-1624cd02-73d5-4555-b8de-b38f00887c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:20:17 compute-0 nova_compute[189265]: 2025-09-30 07:20:17.367 2 DEBUG nova.compute.manager [req-cf6a2198-3726-4929-8bc2-d27afcbb84fa req-23c8db96-28fc-42a0-927e-625da31a130b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Refreshing instance network info cache due to event network-changed-1624cd02-73d5-4555-b8de-b38f00887c31. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 07:20:17 compute-0 nova_compute[189265]: 2025-09-30 07:20:17.367 2 DEBUG oslo_concurrency.lockutils [req-cf6a2198-3726-4929-8bc2-d27afcbb84fa req-23c8db96-28fc-42a0-927e-625da31a130b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-7951b572-4bd4-472b-99e6-32d37b2ea3fd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:20:17 compute-0 nova_compute[189265]: 2025-09-30 07:20:17.367 2 DEBUG oslo_concurrency.lockutils [req-cf6a2198-3726-4929-8bc2-d27afcbb84fa req-23c8db96-28fc-42a0-927e-625da31a130b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-7951b572-4bd4-472b-99e6-32d37b2ea3fd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:20:17 compute-0 nova_compute[189265]: 2025-09-30 07:20:17.368 2 DEBUG nova.network.neutron [req-cf6a2198-3726-4929-8bc2-d27afcbb84fa req-23c8db96-28fc-42a0-927e-625da31a130b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Refreshing network info cache for port 1624cd02-73d5-4555-b8de-b38f00887c31 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 07:20:17 compute-0 podman[214856]: 2025-09-30 07:20:17.487905016 +0000 UTC m=+0.074376154 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid)
Sep 30 07:20:17 compute-0 nova_compute[189265]: 2025-09-30 07:20:17.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:17 compute-0 nova_compute[189265]: 2025-09-30 07:20:17.789 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Acquiring lock "refresh_cache-7951b572-4bd4-472b-99e6-32d37b2ea3fd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:20:17 compute-0 nova_compute[189265]: 2025-09-30 07:20:17.874 2 WARNING neutronclient.v2_0.client [req-cf6a2198-3726-4929-8bc2-d27afcbb84fa req-23c8db96-28fc-42a0-927e-625da31a130b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:20:18 compute-0 nova_compute[189265]: 2025-09-30 07:20:18.302 2 DEBUG nova.network.neutron [req-cf6a2198-3726-4929-8bc2-d27afcbb84fa req-23c8db96-28fc-42a0-927e-625da31a130b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:20:18 compute-0 nova_compute[189265]: 2025-09-30 07:20:18.454 2 DEBUG nova.network.neutron [req-cf6a2198-3726-4929-8bc2-d27afcbb84fa req-23c8db96-28fc-42a0-927e-625da31a130b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:20:18 compute-0 nova_compute[189265]: 2025-09-30 07:20:18.959 2 DEBUG oslo_concurrency.lockutils [req-cf6a2198-3726-4929-8bc2-d27afcbb84fa req-23c8db96-28fc-42a0-927e-625da31a130b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-7951b572-4bd4-472b-99e6-32d37b2ea3fd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:20:18 compute-0 nova_compute[189265]: 2025-09-30 07:20:18.960 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Acquired lock "refresh_cache-7951b572-4bd4-472b-99e6-32d37b2ea3fd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:20:18 compute-0 nova_compute[189265]: 2025-09-30 07:20:18.961 2 DEBUG nova.network.neutron [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:20:20 compute-0 nova_compute[189265]: 2025-09-30 07:20:20.304 2 DEBUG nova.network.neutron [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:20:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:20.545 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:20:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:20.546 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:20:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:20.546 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.282 2 WARNING neutronclient.v2_0.client [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.455 2 DEBUG nova.network.neutron [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Updating instance_info_cache with network_info: [{"id": "1624cd02-73d5-4555-b8de-b38f00887c31", "address": "fa:16:3e:45:04:3d", "network": {"id": "8bd4c178-e5a2-4919-a1df-9c84df6c5788", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-389279842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d196370b58a64910bc1103fb42505b15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1624cd02-73", "ovs_interfaceid": "1624cd02-73d5-4555-b8de-b38f00887c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:20:21 compute-0 podman[214877]: 2025-09-30 07:20:21.476744701 +0000 UTC m=+0.066513004 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., distribution-scope=public, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, name=ubi9-minimal, version=9.6)
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.963 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Releasing lock "refresh_cache-7951b572-4bd4-472b-99e6-32d37b2ea3fd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.964 2 DEBUG nova.compute.manager [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Instance network_info: |[{"id": "1624cd02-73d5-4555-b8de-b38f00887c31", "address": "fa:16:3e:45:04:3d", "network": {"id": "8bd4c178-e5a2-4919-a1df-9c84df6c5788", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-389279842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d196370b58a64910bc1103fb42505b15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1624cd02-73", "ovs_interfaceid": "1624cd02-73d5-4555-b8de-b38f00887c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.967 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Start _get_guest_xml network_info=[{"id": "1624cd02-73d5-4555-b8de-b38f00887c31", "address": "fa:16:3e:45:04:3d", "network": {"id": "8bd4c178-e5a2-4919-a1df-9c84df6c5788", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-389279842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d196370b58a64910bc1103fb42505b15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1624cd02-73", "ovs_interfaceid": "1624cd02-73d5-4555-b8de-b38f00887c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '0c6b92f5-9861-49e4-862d-3ffd84520dfa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.972 2 WARNING nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.975 2 DEBUG nova.virt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteBasicStrategy-server-194258953', uuid='7951b572-4bd4-472b-99e6-32d37b2ea3fd'), owner=OwnerMeta(userid='3a4b8ff28f3345afb27f6afbb0a20f3b', username='tempest-TestExecuteBasicStrategy-698431161-project-admin', projectid='5914ab8585ff4a26a783d58aae38b75d', projectname='tempest-TestExecuteBasicStrategy-698431161'), image=ImageMeta(id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "1624cd02-73d5-4555-b8de-b38f00887c31", "address": "fa:16:3e:45:04:3d", "network": {"id": "8bd4c178-e5a2-4919-a1df-9c84df6c5788", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-389279842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d196370b58a64910bc1103fb42505b15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1624cd02-73", "ovs_interfaceid": "1624cd02-73d5-4555-b8de-b38f00887c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759216821.974964) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.979 2 DEBUG nova.virt.libvirt.host [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.981 2 DEBUG nova.virt.libvirt.host [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.984 2 DEBUG nova.virt.libvirt.host [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.985 2 DEBUG nova.virt.libvirt.host [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.986 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.986 2 DEBUG nova.virt.hardware [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T07:07:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.987 2 DEBUG nova.virt.hardware [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.988 2 DEBUG nova.virt.hardware [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.988 2 DEBUG nova.virt.hardware [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.989 2 DEBUG nova.virt.hardware [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.989 2 DEBUG nova.virt.hardware [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.990 2 DEBUG nova.virt.hardware [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.990 2 DEBUG nova.virt.hardware [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.991 2 DEBUG nova.virt.hardware [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.991 2 DEBUG nova.virt.hardware [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.992 2 DEBUG nova.virt.hardware [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.998 2 DEBUG nova.virt.libvirt.vif [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:20:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-194258953',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-194258953',id=9,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5914ab8585ff4a26a783d58aae38b75d',ramdisk_id='',reservation_id='r-cwou451d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-698431161',owner_user_name='tempest-TestExecuteBasicStrategy-698431161-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:20:15Z,user_data=None,user_id='3a4b8ff28f3345afb27f6afbb0a20f3b',uuid=7951b572-4bd4-472b-99e6-32d37b2ea3fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1624cd02-73d5-4555-b8de-b38f00887c31", "address": "fa:16:3e:45:04:3d", "network": {"id": "8bd4c178-e5a2-4919-a1df-9c84df6c5788", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-389279842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d196370b58a64910bc1103fb42505b15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1624cd02-73", "ovs_interfaceid": "1624cd02-73d5-4555-b8de-b38f00887c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 07:20:21 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.998 2 DEBUG nova.network.os_vif_util [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Converting VIF {"id": "1624cd02-73d5-4555-b8de-b38f00887c31", "address": "fa:16:3e:45:04:3d", "network": {"id": "8bd4c178-e5a2-4919-a1df-9c84df6c5788", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-389279842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d196370b58a64910bc1103fb42505b15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1624cd02-73", "ovs_interfaceid": "1624cd02-73d5-4555-b8de-b38f00887c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:21.999 2 DEBUG nova.network.os_vif_util [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:04:3d,bridge_name='br-int',has_traffic_filtering=True,id=1624cd02-73d5-4555-b8de-b38f00887c31,network=Network(8bd4c178-e5a2-4919-a1df-9c84df6c5788),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1624cd02-73') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.001 2 DEBUG nova.objects.instance [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Lazy-loading 'pci_devices' on Instance uuid 7951b572-4bd4-472b-99e6-32d37b2ea3fd obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.509 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] End _get_guest_xml xml=<domain type="kvm">
Sep 30 07:20:22 compute-0 nova_compute[189265]:   <uuid>7951b572-4bd4-472b-99e6-32d37b2ea3fd</uuid>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   <name>instance-00000009</name>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   <memory>131072</memory>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   <vcpu>1</vcpu>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteBasicStrategy-server-194258953</nova:name>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:20:21</nova:creationTime>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:20:22 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:20:22 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:20:22 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:20:22 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:20:22 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:20:22 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:20:22 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:20:22 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:20:22 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:20:22 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:20:22 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:20:22 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:20:22 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:20:22 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:20:22 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:20:22 compute-0 nova_compute[189265]:         <nova:user uuid="3a4b8ff28f3345afb27f6afbb0a20f3b">tempest-TestExecuteBasicStrategy-698431161-project-admin</nova:user>
Sep 30 07:20:22 compute-0 nova_compute[189265]:         <nova:project uuid="5914ab8585ff4a26a783d58aae38b75d">tempest-TestExecuteBasicStrategy-698431161</nova:project>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:20:22 compute-0 nova_compute[189265]:         <nova:port uuid="1624cd02-73d5-4555-b8de-b38f00887c31">
Sep 30 07:20:22 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <system>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <entry name="serial">7951b572-4bd4-472b-99e6-32d37b2ea3fd</entry>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <entry name="uuid">7951b572-4bd4-472b-99e6-32d37b2ea3fd</entry>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     </system>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   <os>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   </os>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   <features>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <vmcoreinfo/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   </features>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   <cpu mode="host-model" match="exact">
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk.config"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:45:04:3d"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <target dev="tap1624cd02-73"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/console.log" append="off"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <video>
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     </video>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <controller type="usb" index="0"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:20:22 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:20:22 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:20:22 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:20:22 compute-0 nova_compute[189265]: </domain>
Sep 30 07:20:22 compute-0 nova_compute[189265]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.510 2 DEBUG nova.compute.manager [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Preparing to wait for external event network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.511 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Acquiring lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.512 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.512 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.514 2 DEBUG nova.virt.libvirt.vif [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:20:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-194258953',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-194258953',id=9,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5914ab8585ff4a26a783d58aae38b75d',ramdisk_id='',reservation_id='r-cwou451d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteBasicStrategy-698431161',owner_user_name='tempest-TestExecuteBasicStrategy-698431161-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:20:15Z,user_data=None,user_id='3a4b8ff28f3345afb27f6afbb0a20f3b',uuid=7951b572-4bd4-472b-99e6-32d37b2ea3fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1624cd02-73d5-4555-b8de-b38f00887c31", "address": "fa:16:3e:45:04:3d", "network": {"id": "8bd4c178-e5a2-4919-a1df-9c84df6c5788", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-389279842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d196370b58a64910bc1103fb42505b15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1624cd02-73", "ovs_interfaceid": "1624cd02-73d5-4555-b8de-b38f00887c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.514 2 DEBUG nova.network.os_vif_util [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Converting VIF {"id": "1624cd02-73d5-4555-b8de-b38f00887c31", "address": "fa:16:3e:45:04:3d", "network": {"id": "8bd4c178-e5a2-4919-a1df-9c84df6c5788", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-389279842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d196370b58a64910bc1103fb42505b15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1624cd02-73", "ovs_interfaceid": "1624cd02-73d5-4555-b8de-b38f00887c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.515 2 DEBUG nova.network.os_vif_util [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:04:3d,bridge_name='br-int',has_traffic_filtering=True,id=1624cd02-73d5-4555-b8de-b38f00887c31,network=Network(8bd4c178-e5a2-4919-a1df-9c84df6c5788),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1624cd02-73') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.516 2 DEBUG os_vif [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:04:3d,bridge_name='br-int',has_traffic_filtering=True,id=1624cd02-73d5-4555-b8de-b38f00887c31,network=Network(8bd4c178-e5a2-4919-a1df-9c84df6c5788),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1624cd02-73') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.518 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.518 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.520 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '84ca0ef2-bff8-5315-8fd6-9c716a369b5a', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.528 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1624cd02-73, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.529 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap1624cd02-73, col_values=(('qos', UUID('0f1a0a96-9aae-4d89-8bbe-213b1055d993')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.529 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap1624cd02-73, col_values=(('external_ids', {'iface-id': '1624cd02-73d5-4555-b8de-b38f00887c31', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:45:04:3d', 'vm-uuid': '7951b572-4bd4-472b-99e6-32d37b2ea3fd'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:22 compute-0 NetworkManager[51813]: <info>  [1759216822.5326] manager: (tap1624cd02-73): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:22 compute-0 nova_compute[189265]: 2025-09-30 07:20:22.541 2 INFO os_vif [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:04:3d,bridge_name='br-int',has_traffic_filtering=True,id=1624cd02-73d5-4555-b8de-b38f00887c31,network=Network(8bd4c178-e5a2-4919-a1df-9c84df6c5788),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1624cd02-73')
Sep 30 07:20:24 compute-0 nova_compute[189265]: 2025-09-30 07:20:24.102 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:20:24 compute-0 nova_compute[189265]: 2025-09-30 07:20:24.103 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:20:24 compute-0 nova_compute[189265]: 2025-09-30 07:20:24.103 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] No VIF found with MAC fa:16:3e:45:04:3d, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 07:20:24 compute-0 nova_compute[189265]: 2025-09-30 07:20:24.104 2 INFO nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Using config drive
Sep 30 07:20:24 compute-0 podman[214902]: 2025-09-30 07:20:24.488659804 +0000 UTC m=+0.058031936 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, 
tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 07:20:24 compute-0 podman[214903]: 2025-09-30 07:20:24.535101331 +0000 UTC m=+0.109849980 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:20:24 compute-0 podman[214901]: 2025-09-30 07:20:24.535963926 +0000 UTC m=+0.110701475 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 07:20:24 compute-0 nova_compute[189265]: 2025-09-30 07:20:24.615 2 WARNING neutronclient.v2_0.client [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.243 2 INFO nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Creating config drive at /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk.config
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.254 2 DEBUG oslo_concurrency.processutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpo2alv2x5 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.385 2 DEBUG oslo_concurrency.processutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpo2alv2x5" returned: 0 in 0.131s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:20:25 compute-0 kernel: tap1624cd02-73: entered promiscuous mode
Sep 30 07:20:25 compute-0 NetworkManager[51813]: <info>  [1759216825.4714] manager: (tap1624cd02-73): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Sep 30 07:20:25 compute-0 ovn_controller[91436]: 2025-09-30T07:20:25Z|00089|binding|INFO|Claiming lport 1624cd02-73d5-4555-b8de-b38f00887c31 for this chassis.
Sep 30 07:20:25 compute-0 ovn_controller[91436]: 2025-09-30T07:20:25Z|00090|binding|INFO|1624cd02-73d5-4555-b8de-b38f00887c31: Claiming fa:16:3e:45:04:3d 10.100.0.6
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:25 compute-0 systemd-udevd[214979]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.497 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:04:3d 10.100.0.6'], port_security=['fa:16:3e:45:04:3d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7951b572-4bd4-472b-99e6-32d37b2ea3fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8bd4c178-e5a2-4919-a1df-9c84df6c5788', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5914ab8585ff4a26a783d58aae38b75d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cb035a5e-818f-42f5-b01b-9cd518a289c3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0a8dcf8-2d25-4318-a682-128f51c53fdc, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=1624cd02-73d5-4555-b8de-b38f00887c31) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.499 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 1624cd02-73d5-4555-b8de-b38f00887c31 in datapath 8bd4c178-e5a2-4919-a1df-9c84df6c5788 bound to our chassis
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.501 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8bd4c178-e5a2-4919-a1df-9c84df6c5788
Sep 30 07:20:25 compute-0 NetworkManager[51813]: <info>  [1759216825.5080] device (tap1624cd02-73): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:20:25 compute-0 NetworkManager[51813]: <info>  [1759216825.5086] device (tap1624cd02-73): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.516 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[e8c6d805-4999-44e9-bc37-e22e8e8a273d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.517 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8bd4c178-e1 in ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.520 210650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8bd4c178-e0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.520 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[9faefaea-58fd-45f2-a402-aa04979b657c]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.521 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c5834d8b-ba7f-4fa7-89ba-4e59dd72054a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 systemd-machined[149233]: New machine qemu-6-instance-00000009.
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.539 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[b8865dd2-ab8f-4b6e-9bff-8414f6647b1e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:25 compute-0 systemd[1]: Started Virtual Machine qemu-6-instance-00000009.
Sep 30 07:20:25 compute-0 ovn_controller[91436]: 2025-09-30T07:20:25Z|00091|binding|INFO|Setting lport 1624cd02-73d5-4555-b8de-b38f00887c31 ovn-installed in OVS
Sep 30 07:20:25 compute-0 ovn_controller[91436]: 2025-09-30T07:20:25Z|00092|binding|INFO|Setting lport 1624cd02-73d5-4555-b8de-b38f00887c31 up in Southbound
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.561 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[884f6555-eec8-4001-b7c2-219bd842e138]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.594 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[4dd5e918-88d4-4f14-8693-d59fae15ce0e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 NetworkManager[51813]: <info>  [1759216825.5995] manager: (tap8bd4c178-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/37)
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.600 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[df8894c7-49b3-49c7-bee4-283ac700a80e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.639 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[9d2d56b2-e833-4713-8247-b573ffdd33c5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.642 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[db73600b-0ee0-4d27-95d5-e1b1b6646f89]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 NetworkManager[51813]: <info>  [1759216825.6663] device (tap8bd4c178-e0): carrier: link connected
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.676 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[e2ef04dc-5195-4a22-872e-e280214af8cd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.698 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7297f6ad-0491-4434-8554-5d160589af5b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8bd4c178-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:e8:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468338, 'reachable_time': 39620, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215015, 'error': None, 'target': 'ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.718 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[9aeb34f3-5d69-4d4b-8bc1-f5a9e99a6803]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3d:e811'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 468338, 'tstamp': 468338}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215016, 'error': None, 'target': 'ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.739 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f194b13b-9c78-4548-adce-2d28c7474249]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8bd4c178-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3d:e8:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468338, 'reachable_time': 39620, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215017, 'error': None, 'target': 'ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.776 2 DEBUG nova.compute.manager [req-5a5c8fe8-c417-4f70-9564-7c503e0cdb0d req-d79a4960-d005-471c-8e32-e6821b49f8f9 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.778 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0764ecda-efef-463a-937c-33f86dddf8fa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.776 2 DEBUG oslo_concurrency.lockutils [req-5a5c8fe8-c417-4f70-9564-7c503e0cdb0d req-d79a4960-d005-471c-8e32-e6821b49f8f9 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.780 2 DEBUG oslo_concurrency.lockutils [req-5a5c8fe8-c417-4f70-9564-7c503e0cdb0d req-d79a4960-d005-471c-8e32-e6821b49f8f9 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.780 2 DEBUG oslo_concurrency.lockutils [req-5a5c8fe8-c417-4f70-9564-7c503e0cdb0d req-d79a4960-d005-471c-8e32-e6821b49f8f9 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.780 2 DEBUG nova.compute.manager [req-5a5c8fe8-c417-4f70-9564-7c503e0cdb0d req-d79a4960-d005-471c-8e32-e6821b49f8f9 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Processing event network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.853 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d9603764-3ca3-4b4c-82ca-6f84b93b42b5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.854 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8bd4c178-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.854 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.855 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8bd4c178-e0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.856 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:25 compute-0 NetworkManager[51813]: <info>  [1759216825.8577] manager: (tap8bd4c178-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Sep 30 07:20:25 compute-0 kernel: tap8bd4c178-e0: entered promiscuous mode
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.861 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8bd4c178-e0, col_values=(('external_ids', {'iface-id': 'f8042a54-712a-444f-8425-c77f4087b994'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:20:25 compute-0 ovn_controller[91436]: 2025-09-30T07:20:25Z|00093|binding|INFO|Releasing lport f8042a54-712a-444f-8425-c77f4087b994 from this chassis (sb_readonly=0)
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.887 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c58aef9d-aa2a-47bf-9483-bceb731ba4c4]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.888 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8bd4c178-e5a2-4919-a1df-9c84df6c5788.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8bd4c178-e5a2-4919-a1df-9c84df6c5788.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.889 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8bd4c178-e5a2-4919-a1df-9c84df6c5788.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8bd4c178-e5a2-4919-a1df-9c84df6c5788.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.889 100322 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 8bd4c178-e5a2-4919-a1df-9c84df6c5788 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.889 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8bd4c178-e5a2-4919-a1df-9c84df6c5788.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8bd4c178-e5a2-4919-a1df-9c84df6c5788.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.890 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[fe90a513-a31e-481b-ae4e-8f007d92a901]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.891 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8bd4c178-e5a2-4919-a1df-9c84df6c5788.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8bd4c178-e5a2-4919-a1df-9c84df6c5788.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.891 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d1b243fa-7572-42aa-9dd6-4c190f981a29]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.892 100322 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: global
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     log         /dev/log local0 debug
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     log-tag     haproxy-metadata-proxy-8bd4c178-e5a2-4919-a1df-9c84df6c5788
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     user        root
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     group       root
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     maxconn     1024
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     pidfile     /var/lib/neutron/external/pids/8bd4c178-e5a2-4919-a1df-9c84df6c5788.pid.haproxy
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     daemon
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: defaults
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     log global
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     mode http
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     option httplog
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     option dontlognull
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     option http-server-close
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     option forwardfor
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     retries                 3
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     timeout http-request    30s
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     timeout connect         30s
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     timeout client          32s
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     timeout server          32s
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     timeout http-keep-alive 30s
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: listen listener
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     bind 169.254.169.254:80
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:     http-request add-header X-OVN-Network-ID 8bd4c178-e5a2-4919-a1df-9c84df6c5788
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 07:20:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:20:25.893 100322 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788', 'env', 'PROCESS_TAG=haproxy-8bd4c178-e5a2-4919-a1df-9c84df6c5788', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8bd4c178-e5a2-4919-a1df-9c84df6c5788.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 07:20:25 compute-0 nova_compute[189265]: 2025-09-30 07:20:25.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:26 compute-0 podman[215056]: 2025-09-30 07:20:26.351684311 +0000 UTC m=+0.080414031 container create 102dda15f5ba0439dcce223e03fae3978ccc953b4fce5f4c56d813f134bda202 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Sep 30 07:20:26 compute-0 systemd[1]: Started libpod-conmon-102dda15f5ba0439dcce223e03fae3978ccc953b4fce5f4c56d813f134bda202.scope.
Sep 30 07:20:26 compute-0 podman[215056]: 2025-09-30 07:20:26.313686961 +0000 UTC m=+0.042416731 image pull eeebcc09bc72f81ab45f5ab87eb8f6a7b554b949227aeec082bdb0732754ddc8 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 07:20:26 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:20:26 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4474212ac8d7a0560e5f2d3fe7363a6d5192ac9f3449389f064a9904920cb44e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 07:20:26 compute-0 nova_compute[189265]: 2025-09-30 07:20:26.437 2 DEBUG nova.compute.manager [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 07:20:26 compute-0 nova_compute[189265]: 2025-09-30 07:20:26.442 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 07:20:26 compute-0 podman[215056]: 2025-09-30 07:20:26.443856295 +0000 UTC m=+0.172586025 container init 102dda15f5ba0439dcce223e03fae3978ccc953b4fce5f4c56d813f134bda202 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 07:20:26 compute-0 nova_compute[189265]: 2025-09-30 07:20:26.445 2 INFO nova.virt.libvirt.driver [-] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Instance spawned successfully.
Sep 30 07:20:26 compute-0 nova_compute[189265]: 2025-09-30 07:20:26.446 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 07:20:26 compute-0 podman[215056]: 2025-09-30 07:20:26.450735296 +0000 UTC m=+0.179464986 container start 102dda15f5ba0439dcce223e03fae3978ccc953b4fce5f4c56d813f134bda202 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788, tcib_managed=true, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Sep 30 07:20:26 compute-0 neutron-haproxy-ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788[215071]: [NOTICE]   (215075) : New worker (215077) forked
Sep 30 07:20:26 compute-0 neutron-haproxy-ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788[215071]: [NOTICE]   (215075) : Loading success.
Sep 30 07:20:26 compute-0 nova_compute[189265]: 2025-09-30 07:20:26.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:26 compute-0 nova_compute[189265]: 2025-09-30 07:20:26.960 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:20:26 compute-0 nova_compute[189265]: 2025-09-30 07:20:26.961 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:20:26 compute-0 nova_compute[189265]: 2025-09-30 07:20:26.961 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:20:26 compute-0 nova_compute[189265]: 2025-09-30 07:20:26.962 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:20:26 compute-0 nova_compute[189265]: 2025-09-30 07:20:26.962 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:20:26 compute-0 nova_compute[189265]: 2025-09-30 07:20:26.962 2 DEBUG nova.virt.libvirt.driver [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:20:27 compute-0 nova_compute[189265]: 2025-09-30 07:20:27.471 2 INFO nova.compute.manager [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Took 10.86 seconds to spawn the instance on the hypervisor.
Sep 30 07:20:27 compute-0 nova_compute[189265]: 2025-09-30 07:20:27.473 2 DEBUG nova.compute.manager [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:20:27 compute-0 nova_compute[189265]: 2025-09-30 07:20:27.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:27 compute-0 nova_compute[189265]: 2025-09-30 07:20:27.842 2 DEBUG nova.compute.manager [req-b0780e56-32f5-44f1-9130-e5f155744cbb req-8feec08c-3043-4816-96ae-58d235be2354 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:20:27 compute-0 nova_compute[189265]: 2025-09-30 07:20:27.845 2 DEBUG oslo_concurrency.lockutils [req-b0780e56-32f5-44f1-9130-e5f155744cbb req-8feec08c-3043-4816-96ae-58d235be2354 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:20:27 compute-0 nova_compute[189265]: 2025-09-30 07:20:27.846 2 DEBUG oslo_concurrency.lockutils [req-b0780e56-32f5-44f1-9130-e5f155744cbb req-8feec08c-3043-4816-96ae-58d235be2354 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:20:27 compute-0 nova_compute[189265]: 2025-09-30 07:20:27.847 2 DEBUG oslo_concurrency.lockutils [req-b0780e56-32f5-44f1-9130-e5f155744cbb req-8feec08c-3043-4816-96ae-58d235be2354 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:20:27 compute-0 nova_compute[189265]: 2025-09-30 07:20:27.847 2 DEBUG nova.compute.manager [req-b0780e56-32f5-44f1-9130-e5f155744cbb req-8feec08c-3043-4816-96ae-58d235be2354 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] No waiting events found dispatching network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:20:27 compute-0 nova_compute[189265]: 2025-09-30 07:20:27.848 2 WARNING nova.compute.manager [req-b0780e56-32f5-44f1-9130-e5f155744cbb req-8feec08c-3043-4816-96ae-58d235be2354 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received unexpected event network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 for instance with vm_state active and task_state None.
Sep 30 07:20:28 compute-0 nova_compute[189265]: 2025-09-30 07:20:28.008 2 INFO nova.compute.manager [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Took 16.12 seconds to build instance.
Sep 30 07:20:28 compute-0 nova_compute[189265]: 2025-09-30 07:20:28.514 2 DEBUG oslo_concurrency.lockutils [None req-b022a0c9-17e6-4ce7-a64a-52701b925d2f 3a4b8ff28f3345afb27f6afbb0a20f3b 5914ab8585ff4a26a783d58aae38b75d - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.642s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:20:29 compute-0 podman[199733]: time="2025-09-30T07:20:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:20:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:20:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:20:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:20:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3459 "" "Go-http-client/1.1"
Sep 30 07:20:31 compute-0 openstack_network_exporter[201859]: ERROR   07:20:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:20:31 compute-0 openstack_network_exporter[201859]: ERROR   07:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:20:31 compute-0 openstack_network_exporter[201859]: ERROR   07:20:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:20:31 compute-0 openstack_network_exporter[201859]: ERROR   07:20:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:20:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:20:31 compute-0 openstack_network_exporter[201859]: ERROR   07:20:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:20:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:20:31 compute-0 nova_compute[189265]: 2025-09-30 07:20:31.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:32 compute-0 nova_compute[189265]: 2025-09-30 07:20:32.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:33 compute-0 nova_compute[189265]: 2025-09-30 07:20:33.784 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:20:33 compute-0 nova_compute[189265]: 2025-09-30 07:20:33.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:20:35 compute-0 nova_compute[189265]: 2025-09-30 07:20:35.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:20:35 compute-0 nova_compute[189265]: 2025-09-30 07:20:35.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:20:35 compute-0 nova_compute[189265]: 2025-09-30 07:20:35.789 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:20:36 compute-0 nova_compute[189265]: 2025-09-30 07:20:36.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:37 compute-0 nova_compute[189265]: 2025-09-30 07:20:37.575 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:37 compute-0 ovn_controller[91436]: 2025-09-30T07:20:37Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:45:04:3d 10.100.0.6
Sep 30 07:20:37 compute-0 ovn_controller[91436]: 2025-09-30T07:20:37Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:45:04:3d 10.100.0.6
Sep 30 07:20:38 compute-0 nova_compute[189265]: 2025-09-30 07:20:38.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:20:39 compute-0 podman[215095]: 2025-09-30 07:20:39.515207046 +0000 UTC m=+0.082436230 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:20:40 compute-0 nova_compute[189265]: 2025-09-30 07:20:40.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:20:40 compute-0 nova_compute[189265]: 2025-09-30 07:20:40.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:20:41 compute-0 nova_compute[189265]: 2025-09-30 07:20:41.305 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:20:41 compute-0 nova_compute[189265]: 2025-09-30 07:20:41.306 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:20:41 compute-0 nova_compute[189265]: 2025-09-30 07:20:41.306 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:20:41 compute-0 nova_compute[189265]: 2025-09-30 07:20:41.306 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:20:41 compute-0 nova_compute[189265]: 2025-09-30 07:20:41.989 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:42 compute-0 nova_compute[189265]: 2025-09-30 07:20:42.360 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:20:42 compute-0 nova_compute[189265]: 2025-09-30 07:20:42.463 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk --force-share --output=json" returned: 0 in 0.104s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:20:42 compute-0 nova_compute[189265]: 2025-09-30 07:20:42.465 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:20:42 compute-0 nova_compute[189265]: 2025-09-30 07:20:42.547 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:20:42 compute-0 nova_compute[189265]: 2025-09-30 07:20:42.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:42 compute-0 nova_compute[189265]: 2025-09-30 07:20:42.723 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:20:42 compute-0 nova_compute[189265]: 2025-09-30 07:20:42.725 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:20:42 compute-0 nova_compute[189265]: 2025-09-30 07:20:42.743 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:20:42 compute-0 nova_compute[189265]: 2025-09-30 07:20:42.743 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5656MB free_disk=73.27895736694336GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:20:42 compute-0 nova_compute[189265]: 2025-09-30 07:20:42.744 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:20:42 compute-0 nova_compute[189265]: 2025-09-30 07:20:42.744 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:20:43 compute-0 nova_compute[189265]: 2025-09-30 07:20:43.818 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance 7951b572-4bd4-472b-99e6-32d37b2ea3fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:20:43 compute-0 nova_compute[189265]: 2025-09-30 07:20:43.818 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:20:43 compute-0 nova_compute[189265]: 2025-09-30 07:20:43.819 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:20:42 up  1:18,  0 user,  load average: 0.32, 0.37, 0.44\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_5914ab8585ff4a26a783d58aae38b75d': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:20:43 compute-0 nova_compute[189265]: 2025-09-30 07:20:43.857 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:20:44 compute-0 nova_compute[189265]: 2025-09-30 07:20:44.363 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:20:44 compute-0 nova_compute[189265]: 2025-09-30 07:20:44.876 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:20:44 compute-0 nova_compute[189265]: 2025-09-30 07:20:44.877 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.133s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:20:46 compute-0 nova_compute[189265]: 2025-09-30 07:20:46.991 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:47 compute-0 nova_compute[189265]: 2025-09-30 07:20:47.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:48 compute-0 podman[215127]: 2025-09-30 07:20:48.499923665 +0000 UTC m=+0.082288066 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 07:20:48 compute-0 nova_compute[189265]: 2025-09-30 07:20:48.877 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:20:51 compute-0 nova_compute[189265]: 2025-09-30 07:20:51.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:52 compute-0 podman[215147]: 2025-09-30 07:20:52.505982543 +0000 UTC m=+0.084091519 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.openshift.expose-services=, config_id=edpm, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc.)
Sep 30 07:20:52 compute-0 nova_compute[189265]: 2025-09-30 07:20:52.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:54 compute-0 nova_compute[189265]: 2025-09-30 07:20:54.359 2 DEBUG nova.compute.manager [None req-ea95a6d7-1771-48d3-aa4a-025aabf6835b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Sep 30 07:20:54 compute-0 nova_compute[189265]: 2025-09-30 07:20:54.431 2 DEBUG nova.compute.provider_tree [None req-ea95a6d7-1771-48d3-aa4a-025aabf6835b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Updating resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc generation from 8 to 10 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 07:20:55 compute-0 podman[215169]: 2025-09-30 07:20:55.515549757 +0000 UTC m=+0.096019418 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd)
Sep 30 07:20:55 compute-0 podman[215170]: 2025-09-30 07:20:55.51805325 +0000 UTC m=+0.087879670 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Sep 30 07:20:55 compute-0 podman[215171]: 2025-09-30 07:20:55.53620959 +0000 UTC m=+0.106530384 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller)
Sep 30 07:20:55 compute-0 ovn_controller[91436]: 2025-09-30T07:20:55Z|00094|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 07:20:57 compute-0 nova_compute[189265]: 2025-09-30 07:20:57.024 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:57 compute-0 nova_compute[189265]: 2025-09-30 07:20:57.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:20:59 compute-0 podman[199733]: time="2025-09-30T07:20:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:20:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:20:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:20:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:20:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3466 "" "Go-http-client/1.1"
Sep 30 07:21:01 compute-0 openstack_network_exporter[201859]: ERROR   07:21:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:21:01 compute-0 openstack_network_exporter[201859]: ERROR   07:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:21:01 compute-0 openstack_network_exporter[201859]: ERROR   07:21:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:21:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:21:01 compute-0 openstack_network_exporter[201859]: ERROR   07:21:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:21:01 compute-0 openstack_network_exporter[201859]: ERROR   07:21:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:21:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:21:02 compute-0 nova_compute[189265]: 2025-09-30 07:21:02.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:02 compute-0 nova_compute[189265]: 2025-09-30 07:21:02.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:02 compute-0 nova_compute[189265]: 2025-09-30 07:21:02.694 2 DEBUG nova.virt.libvirt.driver [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Check if temp file /var/lib/nova/instances/tmpblenez0y exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 07:21:02 compute-0 nova_compute[189265]: 2025-09-30 07:21:02.698 2 DEBUG nova.compute.manager [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpblenez0y',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7951b572-4bd4-472b-99e6-32d37b2ea3fd',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 07:21:07 compute-0 nova_compute[189265]: 2025-09-30 07:21:07.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:07 compute-0 nova_compute[189265]: 2025-09-30 07:21:07.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:07 compute-0 nova_compute[189265]: 2025-09-30 07:21:07.665 2 DEBUG oslo_concurrency.processutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:21:07 compute-0 nova_compute[189265]: 2025-09-30 07:21:07.734 2 DEBUG oslo_concurrency.processutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:21:07 compute-0 nova_compute[189265]: 2025-09-30 07:21:07.736 2 DEBUG oslo_concurrency.processutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:21:07 compute-0 nova_compute[189265]: 2025-09-30 07:21:07.828 2 DEBUG oslo_concurrency.processutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:21:07 compute-0 nova_compute[189265]: 2025-09-30 07:21:07.829 2 DEBUG nova.compute.manager [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Preparing to wait for external event network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 07:21:07 compute-0 nova_compute[189265]: 2025-09-30 07:21:07.829 2 DEBUG oslo_concurrency.lockutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:07 compute-0 nova_compute[189265]: 2025-09-30 07:21:07.830 2 DEBUG oslo_concurrency.lockutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:07 compute-0 nova_compute[189265]: 2025-09-30 07:21:07.830 2 DEBUG oslo_concurrency.lockutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:10 compute-0 podman[215245]: 2025-09-30 07:21:10.497082542 +0000 UTC m=+0.078472315 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:21:12 compute-0 nova_compute[189265]: 2025-09-30 07:21:12.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:12 compute-0 nova_compute[189265]: 2025-09-30 07:21:12.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:13 compute-0 nova_compute[189265]: 2025-09-30 07:21:13.688 2 DEBUG nova.compute.manager [req-ac0c0270-d49b-40b8-8236-54d73ab0089f req-623f17e0-1924-47c2-9b2c-c54e8b5dd458 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-unplugged-1624cd02-73d5-4555-b8de-b38f00887c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:21:13 compute-0 nova_compute[189265]: 2025-09-30 07:21:13.689 2 DEBUG oslo_concurrency.lockutils [req-ac0c0270-d49b-40b8-8236-54d73ab0089f req-623f17e0-1924-47c2-9b2c-c54e8b5dd458 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:13 compute-0 nova_compute[189265]: 2025-09-30 07:21:13.689 2 DEBUG oslo_concurrency.lockutils [req-ac0c0270-d49b-40b8-8236-54d73ab0089f req-623f17e0-1924-47c2-9b2c-c54e8b5dd458 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:13 compute-0 nova_compute[189265]: 2025-09-30 07:21:13.689 2 DEBUG oslo_concurrency.lockutils [req-ac0c0270-d49b-40b8-8236-54d73ab0089f req-623f17e0-1924-47c2-9b2c-c54e8b5dd458 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:13 compute-0 nova_compute[189265]: 2025-09-30 07:21:13.689 2 DEBUG nova.compute.manager [req-ac0c0270-d49b-40b8-8236-54d73ab0089f req-623f17e0-1924-47c2-9b2c-c54e8b5dd458 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] No event matching network-vif-unplugged-1624cd02-73d5-4555-b8de-b38f00887c31 in dict_keys([('network-vif-plugged', '1624cd02-73d5-4555-b8de-b38f00887c31')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 07:21:13 compute-0 nova_compute[189265]: 2025-09-30 07:21:13.690 2 DEBUG nova.compute.manager [req-ac0c0270-d49b-40b8-8236-54d73ab0089f req-623f17e0-1924-47c2-9b2c-c54e8b5dd458 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-unplugged-1624cd02-73d5-4555-b8de-b38f00887c31 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:21:15 compute-0 nova_compute[189265]: 2025-09-30 07:21:15.355 2 INFO nova.compute.manager [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Took 7.52 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 07:21:15 compute-0 nova_compute[189265]: 2025-09-30 07:21:15.780 2 DEBUG nova.compute.manager [req-b686fa8b-cf07-4041-a978-de7d69473f78 req-7025017d-6f9b-42be-92f2-6bb6ebc5f5a0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:21:15 compute-0 nova_compute[189265]: 2025-09-30 07:21:15.780 2 DEBUG oslo_concurrency.lockutils [req-b686fa8b-cf07-4041-a978-de7d69473f78 req-7025017d-6f9b-42be-92f2-6bb6ebc5f5a0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:15 compute-0 nova_compute[189265]: 2025-09-30 07:21:15.781 2 DEBUG oslo_concurrency.lockutils [req-b686fa8b-cf07-4041-a978-de7d69473f78 req-7025017d-6f9b-42be-92f2-6bb6ebc5f5a0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:15 compute-0 nova_compute[189265]: 2025-09-30 07:21:15.781 2 DEBUG oslo_concurrency.lockutils [req-b686fa8b-cf07-4041-a978-de7d69473f78 req-7025017d-6f9b-42be-92f2-6bb6ebc5f5a0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:15 compute-0 nova_compute[189265]: 2025-09-30 07:21:15.781 2 DEBUG nova.compute.manager [req-b686fa8b-cf07-4041-a978-de7d69473f78 req-7025017d-6f9b-42be-92f2-6bb6ebc5f5a0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Processing event network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 07:21:15 compute-0 nova_compute[189265]: 2025-09-30 07:21:15.781 2 DEBUG nova.compute.manager [req-b686fa8b-cf07-4041-a978-de7d69473f78 req-7025017d-6f9b-42be-92f2-6bb6ebc5f5a0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-changed-1624cd02-73d5-4555-b8de-b38f00887c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:21:15 compute-0 nova_compute[189265]: 2025-09-30 07:21:15.782 2 DEBUG nova.compute.manager [req-b686fa8b-cf07-4041-a978-de7d69473f78 req-7025017d-6f9b-42be-92f2-6bb6ebc5f5a0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Refreshing instance network info cache due to event network-changed-1624cd02-73d5-4555-b8de-b38f00887c31. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 07:21:15 compute-0 nova_compute[189265]: 2025-09-30 07:21:15.782 2 DEBUG oslo_concurrency.lockutils [req-b686fa8b-cf07-4041-a978-de7d69473f78 req-7025017d-6f9b-42be-92f2-6bb6ebc5f5a0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-7951b572-4bd4-472b-99e6-32d37b2ea3fd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:21:15 compute-0 nova_compute[189265]: 2025-09-30 07:21:15.782 2 DEBUG oslo_concurrency.lockutils [req-b686fa8b-cf07-4041-a978-de7d69473f78 req-7025017d-6f9b-42be-92f2-6bb6ebc5f5a0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-7951b572-4bd4-472b-99e6-32d37b2ea3fd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:21:15 compute-0 nova_compute[189265]: 2025-09-30 07:21:15.782 2 DEBUG nova.network.neutron [req-b686fa8b-cf07-4041-a978-de7d69473f78 req-7025017d-6f9b-42be-92f2-6bb6ebc5f5a0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Refreshing network info cache for port 1624cd02-73d5-4555-b8de-b38f00887c31 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 07:21:15 compute-0 nova_compute[189265]: 2025-09-30 07:21:15.784 2 DEBUG nova.compute.manager [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 07:21:16 compute-0 nova_compute[189265]: 2025-09-30 07:21:16.290 2 WARNING neutronclient.v2_0.client [req-b686fa8b-cf07-4041-a978-de7d69473f78 req-7025017d-6f9b-42be-92f2-6bb6ebc5f5a0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:21:16 compute-0 nova_compute[189265]: 2025-09-30 07:21:16.295 2 DEBUG nova.compute.manager [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpblenez0y',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7951b572-4bd4-472b-99e6-32d37b2ea3fd',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(2c5292e0-a61d-4dcf-9cf7-809a5920f6b8),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 07:21:16 compute-0 nova_compute[189265]: 2025-09-30 07:21:16.798 2 WARNING neutronclient.v2_0.client [req-b686fa8b-cf07-4041-a978-de7d69473f78 req-7025017d-6f9b-42be-92f2-6bb6ebc5f5a0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:21:16 compute-0 nova_compute[189265]: 2025-09-30 07:21:16.808 2 DEBUG nova.objects.instance [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'migration_context' on Instance uuid 7951b572-4bd4-472b-99e6-32d37b2ea3fd obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:21:16 compute-0 nova_compute[189265]: 2025-09-30 07:21:16.809 2 DEBUG nova.virt.libvirt.driver [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 07:21:16 compute-0 nova_compute[189265]: 2025-09-30 07:21:16.810 2 DEBUG nova.virt.libvirt.driver [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 07:21:16 compute-0 nova_compute[189265]: 2025-09-30 07:21:16.811 2 DEBUG nova.virt.libvirt.driver [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 07:21:16 compute-0 nova_compute[189265]: 2025-09-30 07:21:16.994 2 DEBUG nova.network.neutron [req-b686fa8b-cf07-4041-a978-de7d69473f78 req-7025017d-6f9b-42be-92f2-6bb6ebc5f5a0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Updated VIF entry in instance network info cache for port 1624cd02-73d5-4555-b8de-b38f00887c31. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 07:21:16 compute-0 nova_compute[189265]: 2025-09-30 07:21:16.994 2 DEBUG nova.network.neutron [req-b686fa8b-cf07-4041-a978-de7d69473f78 req-7025017d-6f9b-42be-92f2-6bb6ebc5f5a0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Updating instance_info_cache with network_info: [{"id": "1624cd02-73d5-4555-b8de-b38f00887c31", "address": "fa:16:3e:45:04:3d", "network": {"id": "8bd4c178-e5a2-4919-a1df-9c84df6c5788", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-389279842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d196370b58a64910bc1103fb42505b15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1624cd02-73", "ovs_interfaceid": "1624cd02-73d5-4555-b8de-b38f00887c31", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:21:17 compute-0 nova_compute[189265]: 2025-09-30 07:21:17.161 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:17 compute-0 nova_compute[189265]: 2025-09-30 07:21:17.313 2 DEBUG nova.virt.libvirt.driver [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 07:21:17 compute-0 nova_compute[189265]: 2025-09-30 07:21:17.314 2 DEBUG nova.virt.libvirt.driver [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 07:21:17 compute-0 nova_compute[189265]: 2025-09-30 07:21:17.323 2 DEBUG nova.virt.libvirt.vif [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:20:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-194258953',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-194258953',id=9,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:20:27Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5914ab8585ff4a26a783d58aae38b75d',ramdisk_id='',reservation_id='r-cwou451d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_di
sk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-698431161',owner_user_name='tempest-TestExecuteBasicStrategy-698431161-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:20:27Z,user_data=None,user_id='3a4b8ff28f3345afb27f6afbb0a20f3b',uuid=7951b572-4bd4-472b-99e6-32d37b2ea3fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1624cd02-73d5-4555-b8de-b38f00887c31", "address": "fa:16:3e:45:04:3d", "network": {"id": "8bd4c178-e5a2-4919-a1df-9c84df6c5788", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-389279842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d196370b58a64910bc1103fb42505b15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1624cd02-73", "ovs_interfaceid": "1624cd02-73d5-4555-b8de-b38f00887c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 07:21:17 compute-0 nova_compute[189265]: 2025-09-30 07:21:17.323 2 DEBUG nova.network.os_vif_util [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "1624cd02-73d5-4555-b8de-b38f00887c31", "address": "fa:16:3e:45:04:3d", "network": {"id": "8bd4c178-e5a2-4919-a1df-9c84df6c5788", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-389279842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d196370b58a64910bc1103fb42505b15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1624cd02-73", "ovs_interfaceid": "1624cd02-73d5-4555-b8de-b38f00887c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:21:17 compute-0 nova_compute[189265]: 2025-09-30 07:21:17.324 2 DEBUG nova.network.os_vif_util [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:04:3d,bridge_name='br-int',has_traffic_filtering=True,id=1624cd02-73d5-4555-b8de-b38f00887c31,network=Network(8bd4c178-e5a2-4919-a1df-9c84df6c5788),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1624cd02-73') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:21:17 compute-0 nova_compute[189265]: 2025-09-30 07:21:17.325 2 DEBUG nova.virt.libvirt.migration [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <mac address="fa:16:3e:45:04:3d"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <model type="virtio"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <mtu size="1442"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <target dev="tap1624cd02-73"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]: </interface>
Sep 30 07:21:17 compute-0 nova_compute[189265]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 07:21:17 compute-0 nova_compute[189265]: 2025-09-30 07:21:17.326 2 DEBUG nova.virt.libvirt.migration [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <name>instance-00000009</name>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <uuid>7951b572-4bd4-472b-99e6-32d37b2ea3fd</uuid>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteBasicStrategy-server-194258953</nova:name>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:20:21</nova:creationTime>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:21:17 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:21:17 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:user uuid="3a4b8ff28f3345afb27f6afbb0a20f3b">tempest-TestExecuteBasicStrategy-698431161-project-admin</nova:user>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:project uuid="5914ab8585ff4a26a783d58aae38b75d">tempest-TestExecuteBasicStrategy-698431161</nova:project>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:port uuid="1624cd02-73d5-4555-b8de-b38f00887c31">
Sep 30 07:21:17 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <memory unit="KiB">131072</memory>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <vcpu placement="static">1</vcpu>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <resource>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <partition>/machine</partition>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </resource>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <system>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="serial">7951b572-4bd4-472b-99e6-32d37b2ea3fd</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="uuid">7951b572-4bd4-472b-99e6-32d37b2ea3fd</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </system>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <os>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </os>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <features>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <vmcoreinfo state="on"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </features>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <cpu mode="host-model" check="partial">
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <on_poweroff>destroy</on_poweroff>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <on_reboot>restart</on_reboot>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <on_crash>destroy</on_crash>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk.config"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <readonly/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="1" port="0x10"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="2" port="0x11"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="3" port="0x12"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="4" port="0x13"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="5" port="0x14"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="6" port="0x15"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="7" port="0x16"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="8" port="0x17"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="9" port="0x18"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="10" port="0x19"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="11" port="0x1a"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="12" port="0x1b"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="13" port="0x1c"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="14" port="0x1d"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="15" port="0x1e"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="16" port="0x1f"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="17" port="0x20"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="18" port="0x21"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="19" port="0x22"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="20" port="0x23"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="21" port="0x24"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="22" port="0x25"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="23" port="0x26"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="24" port="0x27"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="25" port="0x28"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-pci-bridge"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="sata" index="0">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <interface type="ethernet"><mac address="fa:16:3e:45:04:3d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1624cd02-73"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </interface><serial type="pty">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/console.log" append="off"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target type="isa-serial" port="0">
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <model name="isa-serial"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       </target>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <console type="pty">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/console.log" append="off"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target type="serial" port="0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </console>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="usb" bus="0" port="1"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </input>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <input type="mouse" bus="ps2"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <listen type="address" address="::"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </graphics>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <video>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </video>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]: </domain>
Sep 30 07:21:17 compute-0 nova_compute[189265]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 07:21:17 compute-0 nova_compute[189265]: 2025-09-30 07:21:17.327 2 DEBUG nova.virt.libvirt.migration [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <name>instance-00000009</name>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <uuid>7951b572-4bd4-472b-99e6-32d37b2ea3fd</uuid>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteBasicStrategy-server-194258953</nova:name>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:20:21</nova:creationTime>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:21:17 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:21:17 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:user uuid="3a4b8ff28f3345afb27f6afbb0a20f3b">tempest-TestExecuteBasicStrategy-698431161-project-admin</nova:user>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:project uuid="5914ab8585ff4a26a783d58aae38b75d">tempest-TestExecuteBasicStrategy-698431161</nova:project>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:port uuid="1624cd02-73d5-4555-b8de-b38f00887c31">
Sep 30 07:21:17 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <memory unit="KiB">131072</memory>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <vcpu placement="static">1</vcpu>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <resource>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <partition>/machine</partition>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </resource>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <system>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="serial">7951b572-4bd4-472b-99e6-32d37b2ea3fd</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="uuid">7951b572-4bd4-472b-99e6-32d37b2ea3fd</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </system>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <os>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </os>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <features>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <vmcoreinfo state="on"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </features>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <cpu mode="host-model" check="partial">
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <on_poweroff>destroy</on_poweroff>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <on_reboot>restart</on_reboot>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <on_crash>destroy</on_crash>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk.config"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <readonly/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="1" port="0x10"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="2" port="0x11"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="3" port="0x12"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="4" port="0x13"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="5" port="0x14"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="6" port="0x15"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="7" port="0x16"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="8" port="0x17"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="9" port="0x18"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="10" port="0x19"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="11" port="0x1a"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="12" port="0x1b"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="13" port="0x1c"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="14" port="0x1d"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="15" port="0x1e"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="16" port="0x1f"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="17" port="0x20"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="18" port="0x21"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="19" port="0x22"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="20" port="0x23"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="21" port="0x24"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="22" port="0x25"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="23" port="0x26"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="24" port="0x27"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="25" port="0x28"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-pci-bridge"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="sata" index="0">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <interface type="ethernet"><mac address="fa:16:3e:45:04:3d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap1624cd02-73"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </interface><serial type="pty">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/console.log" append="off"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target type="isa-serial" port="0">
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <model name="isa-serial"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       </target>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <console type="pty">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/console.log" append="off"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target type="serial" port="0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </console>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="usb" bus="0" port="1"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </input>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <input type="mouse" bus="ps2"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <listen type="address" address="::"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </graphics>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <video>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </video>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]: </domain>
Sep 30 07:21:17 compute-0 nova_compute[189265]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 07:21:17 compute-0 nova_compute[189265]: 2025-09-30 07:21:17.327 2 DEBUG nova.virt.libvirt.migration [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <name>instance-00000009</name>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <uuid>7951b572-4bd4-472b-99e6-32d37b2ea3fd</uuid>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteBasicStrategy-server-194258953</nova:name>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:20:21</nova:creationTime>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:21:17 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:21:17 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:user uuid="3a4b8ff28f3345afb27f6afbb0a20f3b">tempest-TestExecuteBasicStrategy-698431161-project-admin</nova:user>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:project uuid="5914ab8585ff4a26a783d58aae38b75d">tempest-TestExecuteBasicStrategy-698431161</nova:project>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <nova:port uuid="1624cd02-73d5-4555-b8de-b38f00887c31">
Sep 30 07:21:17 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <memory unit="KiB">131072</memory>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <vcpu placement="static">1</vcpu>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <resource>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <partition>/machine</partition>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </resource>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <system>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="serial">7951b572-4bd4-472b-99e6-32d37b2ea3fd</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="uuid">7951b572-4bd4-472b-99e6-32d37b2ea3fd</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </system>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <os>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </os>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <features>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <vmcoreinfo state="on"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </features>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <cpu mode="host-model" check="partial">
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <on_poweroff>destroy</on_poweroff>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <on_reboot>restart</on_reboot>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <on_crash>destroy</on_crash>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/disk.config"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <readonly/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="1" port="0x10"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="2" port="0x11"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="3" port="0x12"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="4" port="0x13"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="5" port="0x14"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="6" port="0x15"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="7" port="0x16"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="8" port="0x17"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="9" port="0x18"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="10" port="0x19"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="11" port="0x1a"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="12" port="0x1b"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="13" port="0x1c"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="14" port="0x1d"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="15" port="0x1e"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="16" port="0x1f"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="17" port="0x20"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="18" port="0x21"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="19" port="0x22"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="20" port="0x23"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="21" port="0x24"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="22" port="0x25"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="23" port="0x26"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="24" port="0x27"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target chassis="25" port="0x28"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model name="pcie-pci-bridge"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <controller type="sata" index="0">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:45:04:3d"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target dev="tap1624cd02-73"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/console.log" append="off"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target type="isa-serial" port="0">
Sep 30 07:21:17 compute-0 nova_compute[189265]:         <model name="isa-serial"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       </target>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <console type="pty">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd/console.log" append="off"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <target type="serial" port="0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </console>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="usb" bus="0" port="1"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </input>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <input type="mouse" bus="ps2"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <listen type="address" address="::"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </graphics>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <video>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </video>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:21:17 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:21:17 compute-0 nova_compute[189265]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 07:21:17 compute-0 nova_compute[189265]: </domain>
Sep 30 07:21:17 compute-0 nova_compute[189265]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 07:21:17 compute-0 nova_compute[189265]: 2025-09-30 07:21:17.328 2 DEBUG nova.virt.libvirt.driver [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 07:21:17 compute-0 nova_compute[189265]: 2025-09-30 07:21:17.521 2 DEBUG oslo_concurrency.lockutils [req-b686fa8b-cf07-4041-a978-de7d69473f78 req-7025017d-6f9b-42be-92f2-6bb6ebc5f5a0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-7951b572-4bd4-472b-99e6-32d37b2ea3fd" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:21:17 compute-0 nova_compute[189265]: 2025-09-30 07:21:17.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:17 compute-0 nova_compute[189265]: 2025-09-30 07:21:17.817 2 DEBUG nova.virt.libvirt.migration [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 07:21:17 compute-0 nova_compute[189265]: 2025-09-30 07:21:17.818 2 INFO nova.virt.libvirt.migration [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 07:21:18 compute-0 nova_compute[189265]: 2025-09-30 07:21:18.855 2 INFO nova.virt.libvirt.driver [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 07:21:19 compute-0 kernel: tap1624cd02-73 (unregistering): left promiscuous mode
Sep 30 07:21:19 compute-0 NetworkManager[51813]: <info>  [1759216879.3040] device (tap1624cd02-73): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:21:19 compute-0 ovn_controller[91436]: 2025-09-30T07:21:19Z|00095|binding|INFO|Releasing lport 1624cd02-73d5-4555-b8de-b38f00887c31 from this chassis (sb_readonly=0)
Sep 30 07:21:19 compute-0 ovn_controller[91436]: 2025-09-30T07:21:19Z|00096|binding|INFO|Setting lport 1624cd02-73d5-4555-b8de-b38f00887c31 down in Southbound
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:19 compute-0 ovn_controller[91436]: 2025-09-30T07:21:19Z|00097|binding|INFO|Removing iface tap1624cd02-73 ovn-installed in OVS
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.319 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:04:3d 10.100.0.6'], port_security=['fa:16:3e:45:04:3d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '8a9138ed-8977-41ff-9b21-ff90eb637e78'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7951b572-4bd4-472b-99e6-32d37b2ea3fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8bd4c178-e5a2-4919-a1df-9c84df6c5788', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5914ab8585ff4a26a783d58aae38b75d', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'cb035a5e-818f-42f5-b01b-9cd518a289c3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0a8dcf8-2d25-4318-a682-128f51c53fdc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=1624cd02-73d5-4555-b8de-b38f00887c31) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.320 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 1624cd02-73d5-4555-b8de-b38f00887c31 in datapath 8bd4c178-e5a2-4919-a1df-9c84df6c5788 unbound from our chassis
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.322 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8bd4c178-e5a2-4919-a1df-9c84df6c5788, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.322 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[9d745f8d-2b01-4562-bc7f-d5228e8f8d63]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.323 100322 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788 namespace which is not needed anymore
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:19 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000009.scope: Deactivated successfully.
Sep 30 07:21:19 compute-0 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000009.scope: Consumed 14.886s CPU time.
Sep 30 07:21:19 compute-0 systemd-machined[149233]: Machine qemu-6-instance-00000009 terminated.
Sep 30 07:21:19 compute-0 podman[215283]: 2025-09-30 07:21:19.401501623 +0000 UTC m=+0.071485510 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 07:21:19 compute-0 neutron-haproxy-ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788[215071]: [NOTICE]   (215075) : haproxy version is 3.0.5-8e879a5
Sep 30 07:21:19 compute-0 neutron-haproxy-ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788[215071]: [NOTICE]   (215075) : path to executable is /usr/sbin/haproxy
Sep 30 07:21:19 compute-0 neutron-haproxy-ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788[215071]: [WARNING]  (215075) : Exiting Master process...
Sep 30 07:21:19 compute-0 podman[215330]: 2025-09-30 07:21:19.428299217 +0000 UTC m=+0.027275088 container kill 102dda15f5ba0439dcce223e03fae3978ccc953b4fce5f4c56d813f134bda202 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 07:21:19 compute-0 neutron-haproxy-ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788[215071]: [ALERT]    (215075) : Current worker (215077) exited with code 143 (Terminated)
Sep 30 07:21:19 compute-0 neutron-haproxy-ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788[215071]: [WARNING]  (215075) : All workers exited. Exiting... (0)
Sep 30 07:21:19 compute-0 systemd[1]: libpod-102dda15f5ba0439dcce223e03fae3978ccc953b4fce5f4c56d813f134bda202.scope: Deactivated successfully.
Sep 30 07:21:19 compute-0 podman[215347]: 2025-09-30 07:21:19.46365611 +0000 UTC m=+0.022583501 container died 102dda15f5ba0439dcce223e03fae3978ccc953b4fce5f4c56d813f134bda202 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.504 2 DEBUG nova.compute.manager [req-f8ffa868-4969-429c-a48b-97d418c9871c req-213e231a-6860-4841-bee7-d86b5b632688 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-unplugged-1624cd02-73d5-4555-b8de-b38f00887c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.504 2 DEBUG oslo_concurrency.lockutils [req-f8ffa868-4969-429c-a48b-97d418c9871c req-213e231a-6860-4841-bee7-d86b5b632688 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.505 2 DEBUG oslo_concurrency.lockutils [req-f8ffa868-4969-429c-a48b-97d418c9871c req-213e231a-6860-4841-bee7-d86b5b632688 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.505 2 DEBUG oslo_concurrency.lockutils [req-f8ffa868-4969-429c-a48b-97d418c9871c req-213e231a-6860-4841-bee7-d86b5b632688 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.505 2 DEBUG nova.compute.manager [req-f8ffa868-4969-429c-a48b-97d418c9871c req-213e231a-6860-4841-bee7-d86b5b632688 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] No waiting events found dispatching network-vif-unplugged-1624cd02-73d5-4555-b8de-b38f00887c31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.505 2 DEBUG nova.compute.manager [req-f8ffa868-4969-429c-a48b-97d418c9871c req-213e231a-6860-4841-bee7-d86b5b632688 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-unplugged-1624cd02-73d5-4555-b8de-b38f00887c31 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:21:19 compute-0 kernel: tap1624cd02-73: entered promiscuous mode
Sep 30 07:21:19 compute-0 NetworkManager[51813]: <info>  [1759216879.5064] manager: (tap1624cd02-73): new Tun device (/org/freedesktop/NetworkManager/Devices/39)
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:19 compute-0 ovn_controller[91436]: 2025-09-30T07:21:19Z|00098|binding|INFO|Claiming lport 1624cd02-73d5-4555-b8de-b38f00887c31 for this chassis.
Sep 30 07:21:19 compute-0 systemd-udevd[215307]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:21:19 compute-0 ovn_controller[91436]: 2025-09-30T07:21:19Z|00099|binding|INFO|1624cd02-73d5-4555-b8de-b38f00887c31: Claiming fa:16:3e:45:04:3d 10.100.0.6
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.519 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:04:3d 10.100.0.6'], port_security=['fa:16:3e:45:04:3d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '8a9138ed-8977-41ff-9b21-ff90eb637e78'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7951b572-4bd4-472b-99e6-32d37b2ea3fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8bd4c178-e5a2-4919-a1df-9c84df6c5788', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5914ab8585ff4a26a783d58aae38b75d', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'cb035a5e-818f-42f5-b01b-9cd518a289c3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0a8dcf8-2d25-4318-a682-128f51c53fdc, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=1624cd02-73d5-4555-b8de-b38f00887c31) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:21:19 compute-0 kernel: tap1624cd02-73 (unregistering): left promiscuous mode
Sep 30 07:21:19 compute-0 virtnodedevd[189614]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Sep 30 07:21:19 compute-0 virtnodedevd[189614]: hostname: compute-0
Sep 30 07:21:19 compute-0 virtnodedevd[189614]: ethtool ioctl error on tap1624cd02-73: No such device
Sep 30 07:21:19 compute-0 ovn_controller[91436]: 2025-09-30T07:21:19Z|00100|binding|INFO|Setting lport 1624cd02-73d5-4555-b8de-b38f00887c31 ovn-installed in OVS
Sep 30 07:21:19 compute-0 ovn_controller[91436]: 2025-09-30T07:21:19Z|00101|binding|INFO|Setting lport 1624cd02-73d5-4555-b8de-b38f00887c31 up in Southbound
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:19 compute-0 ovn_controller[91436]: 2025-09-30T07:21:19Z|00102|binding|INFO|Releasing lport 1624cd02-73d5-4555-b8de-b38f00887c31 from this chassis (sb_readonly=0)
Sep 30 07:21:19 compute-0 ovn_controller[91436]: 2025-09-30T07:21:19Z|00103|binding|INFO|Setting lport 1624cd02-73d5-4555-b8de-b38f00887c31 down in Southbound
Sep 30 07:21:19 compute-0 ovn_controller[91436]: 2025-09-30T07:21:19Z|00104|binding|INFO|Removing iface tap1624cd02-73 ovn-installed in OVS
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:19 compute-0 virtnodedevd[189614]: ethtool ioctl error on tap1624cd02-73: No such device
Sep 30 07:21:19 compute-0 virtnodedevd[189614]: ethtool ioctl error on tap1624cd02-73: No such device
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.542 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:45:04:3d 10.100.0.6'], port_security=['fa:16:3e:45:04:3d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '8a9138ed-8977-41ff-9b21-ff90eb637e78'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '7951b572-4bd4-472b-99e6-32d37b2ea3fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8bd4c178-e5a2-4919-a1df-9c84df6c5788', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5914ab8585ff4a26a783d58aae38b75d', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'cb035a5e-818f-42f5-b01b-9cd518a289c3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0a8dcf8-2d25-4318-a682-128f51c53fdc, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=1624cd02-73d5-4555-b8de-b38f00887c31) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:21:19 compute-0 virtnodedevd[189614]: ethtool ioctl error on tap1624cd02-73: No such device
Sep 30 07:21:19 compute-0 virtnodedevd[189614]: ethtool ioctl error on tap1624cd02-73: No such device
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:19 compute-0 virtnodedevd[189614]: ethtool ioctl error on tap1624cd02-73: No such device
Sep 30 07:21:19 compute-0 virtnodedevd[189614]: ethtool ioctl error on tap1624cd02-73: No such device
Sep 30 07:21:19 compute-0 virtnodedevd[189614]: ethtool ioctl error on tap1624cd02-73: No such device
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.572 2 DEBUG nova.virt.libvirt.guest [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.573 2 INFO nova.virt.libvirt.driver [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Migration operation has completed
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.573 2 INFO nova.compute.manager [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] _post_live_migration() is started..
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.575 2 DEBUG nova.virt.libvirt.driver [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.575 2 DEBUG nova.virt.libvirt.driver [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.575 2 DEBUG nova.virt.libvirt.driver [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.591 2 WARNING neutronclient.v2_0.client [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.591 2 WARNING neutronclient.v2_0.client [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:21:19 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-102dda15f5ba0439dcce223e03fae3978ccc953b4fce5f4c56d813f134bda202-userdata-shm.mount: Deactivated successfully.
Sep 30 07:21:19 compute-0 systemd[1]: var-lib-containers-storage-overlay-4474212ac8d7a0560e5f2d3fe7363a6d5192ac9f3449389f064a9904920cb44e-merged.mount: Deactivated successfully.
Sep 30 07:21:19 compute-0 podman[215347]: 2025-09-30 07:21:19.629194858 +0000 UTC m=+0.188122229 container cleanup 102dda15f5ba0439dcce223e03fae3978ccc953b4fce5f4c56d813f134bda202 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 07:21:19 compute-0 systemd[1]: libpod-conmon-102dda15f5ba0439dcce223e03fae3978ccc953b4fce5f4c56d813f134bda202.scope: Deactivated successfully.
Sep 30 07:21:19 compute-0 podman[215360]: 2025-09-30 07:21:19.657480895 +0000 UTC m=+0.183371071 container remove 102dda15f5ba0439dcce223e03fae3978ccc953b4fce5f4c56d813f134bda202 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.662 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[6aaecc43-4f06-4cb1-944a-bfc8ce69254a]: (4, ("Tue Sep 30 07:21:19 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788 (102dda15f5ba0439dcce223e03fae3978ccc953b4fce5f4c56d813f134bda202)\n102dda15f5ba0439dcce223e03fae3978ccc953b4fce5f4c56d813f134bda202\nTue Sep 30 07:21:19 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788 (102dda15f5ba0439dcce223e03fae3978ccc953b4fce5f4c56d813f134bda202)\n102dda15f5ba0439dcce223e03fae3978ccc953b4fce5f4c56d813f134bda202\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.663 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[8877fa9a-5a1f-47b9-8ffc-f8ac0bfd3edd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.664 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8bd4c178-e5a2-4919-a1df-9c84df6c5788.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8bd4c178-e5a2-4919-a1df-9c84df6c5788.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.664 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[53c2753b-1792-4a9e-b5ae-4c92b0afa059]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.665 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8bd4c178-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.667 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:19 compute-0 kernel: tap8bd4c178-e0: left promiscuous mode
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.678 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.683 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b3eeaa63-1ef4-40a9-9cb9-93a79a6a3f25]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.718 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[a88c8e5d-4de8-446e-9cbb-53a5b4b44ba4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.719 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[089e9110-562c-407a-929e-64c5eae42501]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.732 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[80187a62-ada1-400b-94f4-5adc77944c75]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 468330, 'reachable_time': 43139, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215409, 'error': None, 'target': 'ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.734 100440 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8bd4c178-e5a2-4919-a1df-9c84df6c5788 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.734 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[81255318-702b-4b43-9f1e-8172935b6c65]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.734 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 1624cd02-73d5-4555-b8de-b38f00887c31 in datapath 8bd4c178-e5a2-4919-a1df-9c84df6c5788 unbound from our chassis
Sep 30 07:21:19 compute-0 systemd[1]: run-netns-ovnmeta\x2d8bd4c178\x2de5a2\x2d4919\x2da1df\x2d9c84df6c5788.mount: Deactivated successfully.
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.736 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8bd4c178-e5a2-4919-a1df-9c84df6c5788, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.737 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[ce6785bb-9ae1-4efa-b401-793bc274ce8f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.737 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 1624cd02-73d5-4555-b8de-b38f00887c31 in datapath 8bd4c178-e5a2-4919-a1df-9c84df6c5788 unbound from our chassis
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.738 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8bd4c178-e5a2-4919-a1df-9c84df6c5788, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.739 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c5fb6962-73e7-4219-bcf2-ab0f1fe1dfe8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:21:19 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:19.739 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.878 2 DEBUG nova.compute.manager [req-ba8c2a4c-5d66-4b47-a225-454b91c4c730 req-2726e3b6-7bf4-4713-8371-4cbf6081d679 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-unplugged-1624cd02-73d5-4555-b8de-b38f00887c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.879 2 DEBUG oslo_concurrency.lockutils [req-ba8c2a4c-5d66-4b47-a225-454b91c4c730 req-2726e3b6-7bf4-4713-8371-4cbf6081d679 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.879 2 DEBUG oslo_concurrency.lockutils [req-ba8c2a4c-5d66-4b47-a225-454b91c4c730 req-2726e3b6-7bf4-4713-8371-4cbf6081d679 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.880 2 DEBUG oslo_concurrency.lockutils [req-ba8c2a4c-5d66-4b47-a225-454b91c4c730 req-2726e3b6-7bf4-4713-8371-4cbf6081d679 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.880 2 DEBUG nova.compute.manager [req-ba8c2a4c-5d66-4b47-a225-454b91c4c730 req-2726e3b6-7bf4-4713-8371-4cbf6081d679 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] No waiting events found dispatching network-vif-unplugged-1624cd02-73d5-4555-b8de-b38f00887c31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:21:19 compute-0 nova_compute[189265]: 2025-09-30 07:21:19.881 2 DEBUG nova.compute.manager [req-ba8c2a4c-5d66-4b47-a225-454b91c4c730 req-2726e3b6-7bf4-4713-8371-4cbf6081d679 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-unplugged-1624cd02-73d5-4555-b8de-b38f00887c31 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.304 2 DEBUG nova.network.neutron [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Activated binding for port 1624cd02-73d5-4555-b8de-b38f00887c31 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.305 2 DEBUG nova.compute.manager [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "1624cd02-73d5-4555-b8de-b38f00887c31", "address": "fa:16:3e:45:04:3d", "network": {"id": "8bd4c178-e5a2-4919-a1df-9c84df6c5788", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-389279842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d196370b58a64910bc1103fb42505b15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1624cd02-73", "ovs_interfaceid": "1624cd02-73d5-4555-b8de-b38f00887c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.306 2 DEBUG nova.virt.libvirt.vif [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:20:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteBasicStrategy-server-194258953',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutebasicstrategy-server-194258953',id=9,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:20:27Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='5914ab8585ff4a26a783d58aae38b75d',ramdisk_id='',reservation_id='r-cwou451d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteBasicStrategy-698431161',owner_user_name='tempest-TestExecuteBasicStrategy-698431161-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:20:57Z,user_data=None,user_id='3a4b8ff28f3345afb27f6afbb0a20f3b',uuid=7951b572-4bd4-472b-99e6-32d37b2ea3fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1624cd02-73d5-4555-b8de-b38f00887c31", "address": "fa:16:3e:45:04:3d", "network": {"id": "8bd4c178-e5a2-4919-a1df-9c84df6c5788", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-389279842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d196370b58a64910bc1103fb42505b15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1624cd02-73", "ovs_interfaceid": "1624cd02-73d5-4555-b8de-b38f00887c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.307 2 DEBUG nova.network.os_vif_util [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "1624cd02-73d5-4555-b8de-b38f00887c31", "address": "fa:16:3e:45:04:3d", "network": {"id": "8bd4c178-e5a2-4919-a1df-9c84df6c5788", "bridge": "br-int", "label": "tempest-TestExecuteBasicStrategy-389279842-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d196370b58a64910bc1103fb42505b15", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1624cd02-73", "ovs_interfaceid": "1624cd02-73d5-4555-b8de-b38f00887c31", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.308 2 DEBUG nova.network.os_vif_util [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:45:04:3d,bridge_name='br-int',has_traffic_filtering=True,id=1624cd02-73d5-4555-b8de-b38f00887c31,network=Network(8bd4c178-e5a2-4919-a1df-9c84df6c5788),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1624cd02-73') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.308 2 DEBUG os_vif [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:04:3d,bridge_name='br-int',has_traffic_filtering=True,id=1624cd02-73d5-4555-b8de-b38f00887c31,network=Network(8bd4c178-e5a2-4919-a1df-9c84df6c5788),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1624cd02-73') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.311 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1624cd02-73, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.316 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=0f1a0a96-9aae-4d89-8bbe-213b1055d993) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.322 2 INFO os_vif [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:45:04:3d,bridge_name='br-int',has_traffic_filtering=True,id=1624cd02-73d5-4555-b8de-b38f00887c31,network=Network(8bd4c178-e5a2-4919-a1df-9c84df6c5788),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1624cd02-73')
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.323 2 DEBUG oslo_concurrency.lockutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.323 2 DEBUG oslo_concurrency.lockutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.324 2 DEBUG oslo_concurrency.lockutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.324 2 DEBUG nova.compute.manager [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.325 2 INFO nova.virt.libvirt.driver [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Deleting instance files /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd_del
Sep 30 07:21:20 compute-0 nova_compute[189265]: 2025-09-30 07:21:20.326 2 INFO nova.virt.libvirt.driver [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Deletion of /var/lib/nova/instances/7951b572-4bd4-472b-99e6-32d37b2ea3fd_del complete
Sep 30 07:21:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:20.547 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:20.547 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:20.547 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.589 2 DEBUG nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.589 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.590 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.590 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.590 2 DEBUG nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] No waiting events found dispatching network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.591 2 WARNING nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received unexpected event network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 for instance with vm_state active and task_state migrating.
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.591 2 DEBUG nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-unplugged-1624cd02-73d5-4555-b8de-b38f00887c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.591 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.592 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.592 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.593 2 DEBUG nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] No waiting events found dispatching network-vif-unplugged-1624cd02-73d5-4555-b8de-b38f00887c31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.593 2 DEBUG nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-unplugged-1624cd02-73d5-4555-b8de-b38f00887c31 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.593 2 DEBUG nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.594 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.594 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.594 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.595 2 DEBUG nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] No waiting events found dispatching network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.595 2 WARNING nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received unexpected event network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 for instance with vm_state active and task_state migrating.
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.595 2 DEBUG nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-unplugged-1624cd02-73d5-4555-b8de-b38f00887c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.596 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.596 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.597 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.597 2 DEBUG nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] No waiting events found dispatching network-vif-unplugged-1624cd02-73d5-4555-b8de-b38f00887c31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.597 2 DEBUG nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-unplugged-1624cd02-73d5-4555-b8de-b38f00887c31 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.598 2 DEBUG nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.598 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.598 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.599 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.599 2 DEBUG nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] No waiting events found dispatching network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.599 2 WARNING nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received unexpected event network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 for instance with vm_state active and task_state migrating.
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.600 2 DEBUG nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received event network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.600 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.600 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.601 2 DEBUG oslo_concurrency.lockutils [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.601 2 DEBUG nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] No waiting events found dispatching network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:21:21 compute-0 nova_compute[189265]: 2025-09-30 07:21:21.601 2 WARNING nova.compute.manager [req-f116b5f8-16d3-4444-8cca-8f0f0ed9acc1 req-9178c368-8695-441e-8cb3-2f5c4a3f6c5e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Received unexpected event network-vif-plugged-1624cd02-73d5-4555-b8de-b38f00887c31 for instance with vm_state active and task_state migrating.
Sep 30 07:21:22 compute-0 nova_compute[189265]: 2025-09-30 07:21:22.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:23 compute-0 podman[215412]: 2025-09-30 07:21:23.511490658 +0000 UTC m=+0.082045388 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, config_id=edpm, io.buildah.version=1.33.7, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, 
com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, architecture=x86_64, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public)
Sep 30 07:21:25 compute-0 nova_compute[189265]: 2025-09-30 07:21:25.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:21:25.740 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:21:26 compute-0 podman[215435]: 2025-09-30 07:21:26.517920901 +0000 UTC m=+0.080690409 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 07:21:26 compute-0 podman[215434]: 2025-09-30 07:21:26.558868567 +0000 UTC m=+0.128918708 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Sep 30 07:21:26 compute-0 podman[215436]: 2025-09-30 07:21:26.573531276 +0000 UTC m=+0.129864596 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:21:27 compute-0 nova_compute[189265]: 2025-09-30 07:21:27.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:29 compute-0 podman[199733]: time="2025-09-30T07:21:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:21:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:21:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:21:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:21:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3002 "" "Go-http-client/1.1"
Sep 30 07:21:30 compute-0 nova_compute[189265]: 2025-09-30 07:21:30.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:30 compute-0 nova_compute[189265]: 2025-09-30 07:21:30.366 2 DEBUG oslo_concurrency.lockutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:30 compute-0 nova_compute[189265]: 2025-09-30 07:21:30.366 2 DEBUG oslo_concurrency.lockutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:30 compute-0 nova_compute[189265]: 2025-09-30 07:21:30.366 2 DEBUG oslo_concurrency.lockutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7951b572-4bd4-472b-99e6-32d37b2ea3fd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:30 compute-0 nova_compute[189265]: 2025-09-30 07:21:30.887 2 DEBUG oslo_concurrency.lockutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:30 compute-0 nova_compute[189265]: 2025-09-30 07:21:30.888 2 DEBUG oslo_concurrency.lockutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:30 compute-0 nova_compute[189265]: 2025-09-30 07:21:30.888 2 DEBUG oslo_concurrency.lockutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:30 compute-0 nova_compute[189265]: 2025-09-30 07:21:30.889 2 DEBUG nova.compute.resource_tracker [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:21:31 compute-0 nova_compute[189265]: 2025-09-30 07:21:31.109 2 WARNING nova.virt.libvirt.driver [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:21:31 compute-0 nova_compute[189265]: 2025-09-30 07:21:31.111 2 DEBUG oslo_concurrency.processutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:21:31 compute-0 nova_compute[189265]: 2025-09-30 07:21:31.131 2 DEBUG oslo_concurrency.processutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:21:31 compute-0 nova_compute[189265]: 2025-09-30 07:21:31.132 2 DEBUG nova.compute.resource_tracker [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5851MB free_disk=73.30792617797852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", 
"product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:21:31 compute-0 nova_compute[189265]: 2025-09-30 07:21:31.132 2 DEBUG oslo_concurrency.lockutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:31 compute-0 nova_compute[189265]: 2025-09-30 07:21:31.132 2 DEBUG oslo_concurrency.lockutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:31 compute-0 openstack_network_exporter[201859]: ERROR   07:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:21:31 compute-0 openstack_network_exporter[201859]: ERROR   07:21:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:21:31 compute-0 openstack_network_exporter[201859]: ERROR   07:21:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:21:31 compute-0 openstack_network_exporter[201859]: ERROR   07:21:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:21:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:21:31 compute-0 openstack_network_exporter[201859]: ERROR   07:21:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:21:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:21:32 compute-0 nova_compute[189265]: 2025-09-30 07:21:32.154 2 DEBUG nova.compute.resource_tracker [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Migration for instance 7951b572-4bd4-472b-99e6-32d37b2ea3fd refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 07:21:32 compute-0 nova_compute[189265]: 2025-09-30 07:21:32.196 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:32 compute-0 nova_compute[189265]: 2025-09-30 07:21:32.664 2 DEBUG nova.compute.resource_tracker [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 07:21:32 compute-0 nova_compute[189265]: 2025-09-30 07:21:32.706 2 DEBUG nova.compute.resource_tracker [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Migration 2c5292e0-a61d-4dcf-9cf7-809a5920f6b8 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 07:21:32 compute-0 nova_compute[189265]: 2025-09-30 07:21:32.707 2 DEBUG nova.compute.resource_tracker [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:21:32 compute-0 nova_compute[189265]: 2025-09-30 07:21:32.708 2 DEBUG nova.compute.resource_tracker [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:21:31 up  1:19,  0 user,  load average: 0.14, 0.31, 0.42\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:21:32 compute-0 nova_compute[189265]: 2025-09-30 07:21:32.757 2 DEBUG nova.compute.provider_tree [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:21:33 compute-0 nova_compute[189265]: 2025-09-30 07:21:33.267 2 DEBUG nova.scheduler.client.report [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:21:33 compute-0 nova_compute[189265]: 2025-09-30 07:21:33.780 2 DEBUG nova.compute.resource_tracker [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:21:33 compute-0 nova_compute[189265]: 2025-09-30 07:21:33.782 2 DEBUG oslo_concurrency.lockutils [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.649s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:33 compute-0 nova_compute[189265]: 2025-09-30 07:21:33.805 2 INFO nova.compute.manager [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 07:21:34 compute-0 nova_compute[189265]: 2025-09-30 07:21:34.784 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:21:34 compute-0 nova_compute[189265]: 2025-09-30 07:21:34.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:21:34 compute-0 nova_compute[189265]: 2025-09-30 07:21:34.925 2 INFO nova.scheduler.client.report [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Deleted allocation for migration 2c5292e0-a61d-4dcf-9cf7-809a5920f6b8
Sep 30 07:21:34 compute-0 nova_compute[189265]: 2025-09-30 07:21:34.926 2 DEBUG nova.virt.libvirt.driver [None req-3f915426-b884-44d1-bb07-6b3cfe4be2b9 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7951b572-4bd4-472b-99e6-32d37b2ea3fd] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 07:21:35 compute-0 nova_compute[189265]: 2025-09-30 07:21:35.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:36 compute-0 nova_compute[189265]: 2025-09-30 07:21:36.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:21:37 compute-0 nova_compute[189265]: 2025-09-30 07:21:37.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:37 compute-0 nova_compute[189265]: 2025-09-30 07:21:37.784 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:21:38 compute-0 nova_compute[189265]: 2025-09-30 07:21:38.293 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:21:38 compute-0 nova_compute[189265]: 2025-09-30 07:21:38.293 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:21:40 compute-0 nova_compute[189265]: 2025-09-30 07:21:40.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:40 compute-0 nova_compute[189265]: 2025-09-30 07:21:40.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:21:40 compute-0 nova_compute[189265]: 2025-09-30 07:21:40.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:21:41 compute-0 nova_compute[189265]: 2025-09-30 07:21:41.300 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:41 compute-0 nova_compute[189265]: 2025-09-30 07:21:41.301 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:41 compute-0 nova_compute[189265]: 2025-09-30 07:21:41.301 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:41 compute-0 nova_compute[189265]: 2025-09-30 07:21:41.301 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:21:41 compute-0 nova_compute[189265]: 2025-09-30 07:21:41.469 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:21:41 compute-0 nova_compute[189265]: 2025-09-30 07:21:41.470 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:21:41 compute-0 nova_compute[189265]: 2025-09-30 07:21:41.485 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:21:41 compute-0 nova_compute[189265]: 2025-09-30 07:21:41.486 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5850MB free_disk=73.30792617797852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:21:41 compute-0 nova_compute[189265]: 2025-09-30 07:21:41.486 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:21:41 compute-0 nova_compute[189265]: 2025-09-30 07:21:41.486 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:21:41 compute-0 podman[215498]: 2025-09-30 07:21:41.495583165 +0000 UTC m=+0.064018500 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:21:42 compute-0 nova_compute[189265]: 2025-09-30 07:21:42.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:42 compute-0 nova_compute[189265]: 2025-09-30 07:21:42.535 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:21:42 compute-0 nova_compute[189265]: 2025-09-30 07:21:42.535 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:21:41 up  1:19,  0 user,  load average: 0.12, 0.30, 0.41\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:21:42 compute-0 nova_compute[189265]: 2025-09-30 07:21:42.556 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:21:43 compute-0 nova_compute[189265]: 2025-09-30 07:21:43.103 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:21:43 compute-0 nova_compute[189265]: 2025-09-30 07:21:43.614 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:21:43 compute-0 nova_compute[189265]: 2025-09-30 07:21:43.614 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.128s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:21:44 compute-0 nova_compute[189265]: 2025-09-30 07:21:44.613 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:21:45 compute-0 nova_compute[189265]: 2025-09-30 07:21:45.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:45 compute-0 nova_compute[189265]: 2025-09-30 07:21:45.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:21:47 compute-0 nova_compute[189265]: 2025-09-30 07:21:47.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:48 compute-0 nova_compute[189265]: 2025-09-30 07:21:48.651 2 DEBUG nova.compute.manager [None req-d0b2c741-0299-436f-be0a-6dfd5ecd1bf3 bddd62d17bac483fb429dd18b1062646 4049964ce8244dacb50493f6676c6613 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Sep 30 07:21:48 compute-0 nova_compute[189265]: 2025-09-30 07:21:48.720 2 DEBUG nova.compute.provider_tree [None req-d0b2c741-0299-436f-be0a-6dfd5ecd1bf3 bddd62d17bac483fb429dd18b1062646 4049964ce8244dacb50493f6676c6613 - - default default] Updating resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc generation from 10 to 13 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 07:21:50 compute-0 nova_compute[189265]: 2025-09-30 07:21:50.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:50 compute-0 podman[215523]: 2025-09-30 07:21:50.501838203 +0000 UTC m=+0.079615560 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 07:21:51 compute-0 nova_compute[189265]: 2025-09-30 07:21:51.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:52 compute-0 nova_compute[189265]: 2025-09-30 07:21:52.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:54 compute-0 podman[215543]: 2025-09-30 07:21:54.477299581 +0000 UTC m=+0.064841744 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7)
Sep 30 07:21:55 compute-0 nova_compute[189265]: 2025-09-30 07:21:55.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:57 compute-0 nova_compute[189265]: 2025-09-30 07:21:57.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:21:57 compute-0 podman[215565]: 2025-09-30 07:21:57.511800634 +0000 UTC m=+0.084539572 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd)
Sep 30 07:21:57 compute-0 podman[215566]: 2025-09-30 07:21:57.52826989 +0000 UTC m=+0.094636594 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:21:57 compute-0 podman[215567]: 2025-09-30 07:21:57.569549622 +0000 UTC m=+0.130935712 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller)
Sep 30 07:21:59 compute-0 podman[199733]: time="2025-09-30T07:21:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:21:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:21:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:21:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:21:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3004 "" "Go-http-client/1.1"
Sep 30 07:22:00 compute-0 nova_compute[189265]: 2025-09-30 07:22:00.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:01 compute-0 openstack_network_exporter[201859]: ERROR   07:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:22:01 compute-0 openstack_network_exporter[201859]: ERROR   07:22:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:22:01 compute-0 openstack_network_exporter[201859]: ERROR   07:22:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:22:01 compute-0 openstack_network_exporter[201859]: ERROR   07:22:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:22:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:22:01 compute-0 openstack_network_exporter[201859]: ERROR   07:22:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:22:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:22:02 compute-0 nova_compute[189265]: 2025-09-30 07:22:02.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:05 compute-0 nova_compute[189265]: 2025-09-30 07:22:05.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:07 compute-0 nova_compute[189265]: 2025-09-30 07:22:07.294 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:07 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:07.498 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1a:2c:4d 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ec9004ba-bd6c-4e20-b00d-feb88b99ee44', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec9004ba-bd6c-4e20-b00d-feb88b99ee44', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '184a280afe8c4486b678a6fa22680b9c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9400120a-0c15-4215-9e68-bf0999d55b59, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b8c8f789-2f43-4eab-8ce2-27262b3b0364) old=Port_Binding(mac=['fa:16:3e:1a:2c:4d'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-ec9004ba-bd6c-4e20-b00d-feb88b99ee44', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ec9004ba-bd6c-4e20-b00d-feb88b99ee44', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '184a280afe8c4486b678a6fa22680b9c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:22:07 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:07.500 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b8c8f789-2f43-4eab-8ce2-27262b3b0364 in datapath ec9004ba-bd6c-4e20-b00d-feb88b99ee44 updated
Sep 30 07:22:07 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:07.501 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ec9004ba-bd6c-4e20-b00d-feb88b99ee44, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:22:07 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:07.502 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7b390a1d-7e69-4f16-a405-548ed9c6f797]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:22:10 compute-0 nova_compute[189265]: 2025-09-30 07:22:10.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:12 compute-0 nova_compute[189265]: 2025-09-30 07:22:12.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:12 compute-0 podman[215631]: 2025-09-30 07:22:12.510239132 +0000 UTC m=+0.084808680 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:22:15 compute-0 nova_compute[189265]: 2025-09-30 07:22:15.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:15.620 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:05:09 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-82419866-45b4-454e-be82-f59ce5cad550', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82419866-45b4-454e-be82-f59ce5cad550', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5cddff1fdc22456a82f601ab6c1abfc2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5011f877-c766-41ae-918e-a1f15fc1a327, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=1910a9dd-fc62-4a3a-a248-dba577b56865) old=Port_Binding(mac=['fa:16:3e:f0:05:09'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-82419866-45b4-454e-be82-f59ce5cad550', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-82419866-45b4-454e-be82-f59ce5cad550', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5cddff1fdc22456a82f601ab6c1abfc2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:22:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:15.621 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 1910a9dd-fc62-4a3a-a248-dba577b56865 in datapath 82419866-45b4-454e-be82-f59ce5cad550 updated
Sep 30 07:22:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:15.622 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 82419866-45b4-454e-be82-f59ce5cad550, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:22:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:15.622 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[5f855c9b-797e-4e57-b37e-d020a0a2355a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:22:17 compute-0 nova_compute[189265]: 2025-09-30 07:22:17.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:20 compute-0 nova_compute[189265]: 2025-09-30 07:22:20.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:20.548 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:22:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:20.549 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:22:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:20.549 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:22:21 compute-0 podman[215657]: 2025-09-30 07:22:21.50665687 +0000 UTC m=+0.082131043 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 07:22:22 compute-0 nova_compute[189265]: 2025-09-30 07:22:22.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:25 compute-0 ovn_controller[91436]: 2025-09-30T07:22:25Z|00105|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Sep 30 07:22:25 compute-0 nova_compute[189265]: 2025-09-30 07:22:25.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:25 compute-0 podman[215678]: 2025-09-30 07:22:25.4579988 +0000 UTC m=+0.050222621 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible)
Sep 30 07:22:27 compute-0 nova_compute[189265]: 2025-09-30 07:22:27.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:28 compute-0 podman[215700]: 2025-09-30 07:22:28.48717814 +0000 UTC m=+0.068720256 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250930)
Sep 30 07:22:28 compute-0 podman[215701]: 2025-09-30 07:22:28.507302401 +0000 UTC m=+0.077224171 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930)
Sep 30 07:22:28 compute-0 podman[215702]: 2025-09-30 07:22:28.562581557 +0000 UTC m=+0.127467472 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:22:29 compute-0 podman[199733]: time="2025-09-30T07:22:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:22:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:22:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:22:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:22:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3004 "" "Go-http-client/1.1"
Sep 30 07:22:30 compute-0 nova_compute[189265]: 2025-09-30 07:22:30.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:31.086 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:22:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:31.087 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:22:31 compute-0 nova_compute[189265]: 2025-09-30 07:22:31.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:31 compute-0 openstack_network_exporter[201859]: ERROR   07:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:22:31 compute-0 openstack_network_exporter[201859]: ERROR   07:22:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:22:31 compute-0 openstack_network_exporter[201859]: ERROR   07:22:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:22:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:22:31 compute-0 openstack_network_exporter[201859]: ERROR   07:22:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:22:31 compute-0 openstack_network_exporter[201859]: ERROR   07:22:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:22:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:22:32 compute-0 nova_compute[189265]: 2025-09-30 07:22:32.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:34 compute-0 nova_compute[189265]: 2025-09-30 07:22:34.784 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:22:34 compute-0 nova_compute[189265]: 2025-09-30 07:22:34.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:22:35 compute-0 nova_compute[189265]: 2025-09-30 07:22:35.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:37.089 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:22:37 compute-0 nova_compute[189265]: 2025-09-30 07:22:37.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:37 compute-0 nova_compute[189265]: 2025-09-30 07:22:37.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:22:37 compute-0 nova_compute[189265]: 2025-09-30 07:22:37.787 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:22:38 compute-0 nova_compute[189265]: 2025-09-30 07:22:38.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:22:40 compute-0 nova_compute[189265]: 2025-09-30 07:22:40.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:40 compute-0 nova_compute[189265]: 2025-09-30 07:22:40.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:22:40 compute-0 nova_compute[189265]: 2025-09-30 07:22:40.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:22:41 compute-0 nova_compute[189265]: 2025-09-30 07:22:41.307 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:22:41 compute-0 nova_compute[189265]: 2025-09-30 07:22:41.307 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:22:41 compute-0 nova_compute[189265]: 2025-09-30 07:22:41.308 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:22:41 compute-0 nova_compute[189265]: 2025-09-30 07:22:41.308 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:22:41 compute-0 nova_compute[189265]: 2025-09-30 07:22:41.465 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:22:41 compute-0 nova_compute[189265]: 2025-09-30 07:22:41.467 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:22:41 compute-0 nova_compute[189265]: 2025-09-30 07:22:41.488 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:22:41 compute-0 nova_compute[189265]: 2025-09-30 07:22:41.489 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5839MB free_disk=73.30794525146484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:22:41 compute-0 nova_compute[189265]: 2025-09-30 07:22:41.489 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:22:41 compute-0 nova_compute[189265]: 2025-09-30 07:22:41.490 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:22:42 compute-0 nova_compute[189265]: 2025-09-30 07:22:42.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:42.510 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a7:7d:c8 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02a4831cb362481d98b354ed3bf2d113', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea21a402-508c-472e-bd89-e4a2e8cde5bb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=8b5421e3-6f92-4b98-bc3c-4670813d915c) old=Port_Binding(mac=['fa:16:3e:a7:7d:c8'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02a4831cb362481d98b354ed3bf2d113', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:22:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:42.510 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 8b5421e3-6f92-4b98-bc3c-4670813d915c in datapath 0a07ba3d-468f-4279-9be2-b3ef141df6a7 updated
Sep 30 07:22:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:42.511 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a07ba3d-468f-4279-9be2-b3ef141df6a7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:22:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:42.513 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c15af736-dced-4f03-b168-1874ff19653f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:22:42 compute-0 nova_compute[189265]: 2025-09-30 07:22:42.538 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:22:42 compute-0 nova_compute[189265]: 2025-09-30 07:22:42.539 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:22:41 up  1:20,  0 user,  load average: 0.04, 0.24, 0.38\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:22:42 compute-0 nova_compute[189265]: 2025-09-30 07:22:42.562 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:22:43 compute-0 nova_compute[189265]: 2025-09-30 07:22:43.071 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:22:43 compute-0 podman[215766]: 2025-09-30 07:22:43.462000665 +0000 UTC m=+0.049675975 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:22:43 compute-0 nova_compute[189265]: 2025-09-30 07:22:43.580 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:22:43 compute-0 nova_compute[189265]: 2025-09-30 07:22:43.580 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:22:45 compute-0 nova_compute[189265]: 2025-09-30 07:22:45.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:46 compute-0 nova_compute[189265]: 2025-09-30 07:22:46.581 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:22:46 compute-0 nova_compute[189265]: 2025-09-30 07:22:46.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:22:47 compute-0 nova_compute[189265]: 2025-09-30 07:22:47.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:48.170 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:51:d3:fe 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c80db143-60d5-43f3-9add-e70102d4be73', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c80db143-60d5-43f3-9add-e70102d4be73', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ad7bd988b6047509c2c19eb4e0dc32c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=abba67d4-7230-45e6-8b9c-34c5989d3ac4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b368b422-9eeb-4fe5-903c-f76193d0211d) old=Port_Binding(mac=['fa:16:3e:51:d3:fe'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-c80db143-60d5-43f3-9add-e70102d4be73', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c80db143-60d5-43f3-9add-e70102d4be73', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ad7bd988b6047509c2c19eb4e0dc32c', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:22:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:48.171 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b368b422-9eeb-4fe5-903c-f76193d0211d in datapath c80db143-60d5-43f3-9add-e70102d4be73 updated
Sep 30 07:22:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:48.172 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c80db143-60d5-43f3-9add-e70102d4be73, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:22:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:22:48.172 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d7d43437-063f-4641-8129-e8004fa0fa14]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:22:50 compute-0 nova_compute[189265]: 2025-09-30 07:22:50.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:52 compute-0 nova_compute[189265]: 2025-09-30 07:22:52.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:52 compute-0 podman[215789]: 2025-09-30 07:22:52.541710438 +0000 UTC m=+0.128615906 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:22:54 compute-0 unix_chkpwd[215812]: password check failed for user (root)
Sep 30 07:22:54 compute-0 sshd-session[215810]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.10.225  user=root
Sep 30 07:22:55 compute-0 nova_compute[189265]: 2025-09-30 07:22:55.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:56 compute-0 sshd-session[215810]: Failed password for root from 141.98.10.225 port 41028 ssh2
Sep 30 07:22:56 compute-0 podman[215813]: 2025-09-30 07:22:56.512241542 +0000 UTC m=+0.087577800 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 07:22:57 compute-0 nova_compute[189265]: 2025-09-30 07:22:57.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:22:58 compute-0 unix_chkpwd[215834]: password check failed for user (root)
Sep 30 07:22:59 compute-0 podman[215835]: 2025-09-30 07:22:59.51263405 +0000 UTC m=+0.089021362 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 07:22:59 compute-0 podman[215837]: 2025-09-30 07:22:59.533660387 +0000 UTC m=+0.113142499 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:22:59 compute-0 podman[215836]: 2025-09-30 07:22:59.536214571 +0000 UTC m=+0.110096891 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:22:59 compute-0 podman[199733]: time="2025-09-30T07:22:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:22:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:22:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:22:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:22:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3007 "" "Go-http-client/1.1"
Sep 30 07:23:00 compute-0 sshd-session[215810]: Failed password for root from 141.98.10.225 port 41028 ssh2
Sep 30 07:23:00 compute-0 nova_compute[189265]: 2025-09-30 07:23:00.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:00 compute-0 unix_chkpwd[215897]: password check failed for user (root)
Sep 30 07:23:01 compute-0 openstack_network_exporter[201859]: ERROR   07:23:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:23:01 compute-0 openstack_network_exporter[201859]: ERROR   07:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:23:01 compute-0 openstack_network_exporter[201859]: ERROR   07:23:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:23:01 compute-0 openstack_network_exporter[201859]: ERROR   07:23:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:23:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:23:01 compute-0 openstack_network_exporter[201859]: ERROR   07:23:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:23:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:23:02 compute-0 sshd-session[215810]: Failed password for root from 141.98.10.225 port 41028 ssh2
Sep 30 07:23:02 compute-0 nova_compute[189265]: 2025-09-30 07:23:02.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:02 compute-0 sshd-session[215810]: Received disconnect from 141.98.10.225 port 41028:11:  [preauth]
Sep 30 07:23:02 compute-0 sshd-session[215810]: Disconnected from authenticating user root 141.98.10.225 port 41028 [preauth]
Sep 30 07:23:02 compute-0 sshd-session[215810]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.10.225  user=root
Sep 30 07:23:03 compute-0 unix_chkpwd[215900]: password check failed for user (root)
Sep 30 07:23:03 compute-0 sshd-session[215898]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.10.225  user=root
Sep 30 07:23:05 compute-0 sshd-session[215898]: Failed password for root from 141.98.10.225 port 41034 ssh2
Sep 30 07:23:05 compute-0 nova_compute[189265]: 2025-09-30 07:23:05.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:05 compute-0 unix_chkpwd[215901]: password check failed for user (root)
Sep 30 07:23:07 compute-0 nova_compute[189265]: 2025-09-30 07:23:07.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:07 compute-0 sshd-session[215898]: Failed password for root from 141.98.10.225 port 41034 ssh2
Sep 30 07:23:07 compute-0 unix_chkpwd[215902]: password check failed for user (root)
Sep 30 07:23:08 compute-0 sshd-session[215898]: Failed password for root from 141.98.10.225 port 41034 ssh2
Sep 30 07:23:09 compute-0 sshd-session[215898]: Received disconnect from 141.98.10.225 port 41034:11:  [preauth]
Sep 30 07:23:09 compute-0 sshd-session[215898]: Disconnected from authenticating user root 141.98.10.225 port 41034 [preauth]
Sep 30 07:23:09 compute-0 sshd-session[215898]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.10.225  user=root
Sep 30 07:23:10 compute-0 nova_compute[189265]: 2025-09-30 07:23:10.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:10 compute-0 unix_chkpwd[215905]: password check failed for user (root)
Sep 30 07:23:10 compute-0 sshd-session[215903]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.10.225  user=root
Sep 30 07:23:12 compute-0 nova_compute[189265]: 2025-09-30 07:23:12.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:12 compute-0 sshd-session[215903]: Failed password for root from 141.98.10.225 port 62726 ssh2
Sep 30 07:23:14 compute-0 podman[215906]: 2025-09-30 07:23:14.51184459 +0000 UTC m=+0.083389009 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:23:14 compute-0 unix_chkpwd[215930]: password check failed for user (root)
Sep 30 07:23:15 compute-0 nova_compute[189265]: 2025-09-30 07:23:15.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:16 compute-0 sshd-session[215903]: Failed password for root from 141.98.10.225 port 62726 ssh2
Sep 30 07:23:16 compute-0 unix_chkpwd[215931]: password check failed for user (root)
Sep 30 07:23:17 compute-0 nova_compute[189265]: 2025-09-30 07:23:17.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:19 compute-0 sshd-session[215903]: Failed password for root from 141.98.10.225 port 62726 ssh2
Sep 30 07:23:20 compute-0 nova_compute[189265]: 2025-09-30 07:23:20.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:20.550 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:23:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:20.551 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:23:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:20.551 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:23:21 compute-0 sshd-session[215903]: Received disconnect from 141.98.10.225 port 62726:11:  [preauth]
Sep 30 07:23:21 compute-0 sshd-session[215903]: Disconnected from authenticating user root 141.98.10.225 port 62726 [preauth]
Sep 30 07:23:21 compute-0 sshd-session[215903]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=141.98.10.225  user=root
Sep 30 07:23:22 compute-0 nova_compute[189265]: 2025-09-30 07:23:22.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:23 compute-0 podman[215934]: 2025-09-30 07:23:23.508686749 +0000 UTC m=+0.086368105 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid)
Sep 30 07:23:25 compute-0 nova_compute[189265]: 2025-09-30 07:23:25.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:26 compute-0 nova_compute[189265]: 2025-09-30 07:23:26.021 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "20fa7912-824c-4a80-87c5-319b83c0031b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:23:26 compute-0 nova_compute[189265]: 2025-09-30 07:23:26.021 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "20fa7912-824c-4a80-87c5-319b83c0031b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:23:26 compute-0 nova_compute[189265]: 2025-09-30 07:23:26.550 2 DEBUG nova.compute.manager [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 07:23:27 compute-0 nova_compute[189265]: 2025-09-30 07:23:27.109 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:23:27 compute-0 nova_compute[189265]: 2025-09-30 07:23:27.110 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:23:27 compute-0 nova_compute[189265]: 2025-09-30 07:23:27.118 2 DEBUG nova.virt.hardware [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 07:23:27 compute-0 nova_compute[189265]: 2025-09-30 07:23:27.118 2 INFO nova.compute.claims [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Claim successful on node compute-0.ctlplane.example.com
Sep 30 07:23:27 compute-0 podman[215954]: 2025-09-30 07:23:27.507764098 +0000 UTC m=+0.083704178 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7)
Sep 30 07:23:27 compute-0 nova_compute[189265]: 2025-09-30 07:23:27.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:28 compute-0 nova_compute[189265]: 2025-09-30 07:23:28.193 2 DEBUG nova.compute.provider_tree [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:23:28 compute-0 nova_compute[189265]: 2025-09-30 07:23:28.701 2 DEBUG nova.scheduler.client.report [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:23:29 compute-0 nova_compute[189265]: 2025-09-30 07:23:29.216 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:23:29 compute-0 nova_compute[189265]: 2025-09-30 07:23:29.217 2 DEBUG nova.compute.manager [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 07:23:29 compute-0 nova_compute[189265]: 2025-09-30 07:23:29.731 2 DEBUG nova.compute.manager [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 07:23:29 compute-0 nova_compute[189265]: 2025-09-30 07:23:29.732 2 DEBUG nova.network.neutron [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 07:23:29 compute-0 nova_compute[189265]: 2025-09-30 07:23:29.733 2 WARNING neutronclient.v2_0.client [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:23:29 compute-0 nova_compute[189265]: 2025-09-30 07:23:29.733 2 WARNING neutronclient.v2_0.client [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:23:29 compute-0 podman[199733]: time="2025-09-30T07:23:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:23:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:23:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:23:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:23:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 2998 "" "Go-http-client/1.1"
Sep 30 07:23:30 compute-0 nova_compute[189265]: 2025-09-30 07:23:30.243 2 INFO nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 07:23:30 compute-0 nova_compute[189265]: 2025-09-30 07:23:30.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:30 compute-0 podman[215978]: 2025-09-30 07:23:30.522237074 +0000 UTC m=+0.093231664 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:23:30 compute-0 podman[215979]: 2025-09-30 07:23:30.534014634 +0000 UTC m=+0.106299511 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Sep 30 07:23:30 compute-0 podman[215980]: 2025-09-30 07:23:30.570215269 +0000 UTC m=+0.127193874 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 07:23:30 compute-0 nova_compute[189265]: 2025-09-30 07:23:30.756 2 DEBUG nova.compute.manager [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 07:23:31 compute-0 openstack_network_exporter[201859]: ERROR   07:23:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:23:31 compute-0 openstack_network_exporter[201859]: ERROR   07:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:23:31 compute-0 openstack_network_exporter[201859]: ERROR   07:23:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:23:31 compute-0 openstack_network_exporter[201859]: ERROR   07:23:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:23:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:23:31 compute-0 openstack_network_exporter[201859]: ERROR   07:23:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:23:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:31.431 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:23:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:31.432 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.575 2 DEBUG nova.network.neutron [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Successfully created port: acb8976e-1c6c-4332-be28-4b0a44f90678 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.778 2 DEBUG nova.compute.manager [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.780 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.781 2 INFO nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Creating image(s)
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.782 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "/var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.783 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "/var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.784 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "/var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.785 2 DEBUG oslo_utils.imageutils.format_inspector [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.793 2 DEBUG oslo_utils.imageutils.format_inspector [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.798 2 DEBUG oslo_concurrency.processutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.884 2 DEBUG oslo_concurrency.processutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.885 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.886 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.887 2 DEBUG oslo_utils.imageutils.format_inspector [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.891 2 DEBUG oslo_utils.imageutils.format_inspector [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.892 2 DEBUG oslo_concurrency.processutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.940 2 DEBUG oslo_concurrency.processutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.941 2 DEBUG oslo_concurrency.processutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.976 2 DEBUG oslo_concurrency.processutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.977 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.091s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:23:31 compute-0 nova_compute[189265]: 2025-09-30 07:23:31.978 2 DEBUG oslo_concurrency.processutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.043 2 DEBUG oslo_concurrency.processutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.044 2 DEBUG nova.virt.disk.api [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Checking if we can resize image /var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.045 2 DEBUG oslo_concurrency.processutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.140 2 DEBUG oslo_concurrency.processutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.142 2 DEBUG nova.virt.disk.api [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Cannot resize image /var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.143 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.143 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Ensure instance console log exists: /var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.144 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.144 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.144 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.690 2 DEBUG nova.network.neutron [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Successfully updated port: acb8976e-1c6c-4332-be28-4b0a44f90678 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.755 2 DEBUG nova.compute.manager [req-11098a9c-53ab-406e-8eb4-cdbd13f1dcb6 req-1a66a35b-15e6-4a75-b1e9-9d4ce189882d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Received event network-changed-acb8976e-1c6c-4332-be28-4b0a44f90678 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.755 2 DEBUG nova.compute.manager [req-11098a9c-53ab-406e-8eb4-cdbd13f1dcb6 req-1a66a35b-15e6-4a75-b1e9-9d4ce189882d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Refreshing instance network info cache due to event network-changed-acb8976e-1c6c-4332-be28-4b0a44f90678. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.756 2 DEBUG oslo_concurrency.lockutils [req-11098a9c-53ab-406e-8eb4-cdbd13f1dcb6 req-1a66a35b-15e6-4a75-b1e9-9d4ce189882d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-20fa7912-824c-4a80-87c5-319b83c0031b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.756 2 DEBUG oslo_concurrency.lockutils [req-11098a9c-53ab-406e-8eb4-cdbd13f1dcb6 req-1a66a35b-15e6-4a75-b1e9-9d4ce189882d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-20fa7912-824c-4a80-87c5-319b83c0031b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:23:32 compute-0 nova_compute[189265]: 2025-09-30 07:23:32.756 2 DEBUG nova.network.neutron [req-11098a9c-53ab-406e-8eb4-cdbd13f1dcb6 req-1a66a35b-15e6-4a75-b1e9-9d4ce189882d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Refreshing network info cache for port acb8976e-1c6c-4332-be28-4b0a44f90678 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 07:23:33 compute-0 nova_compute[189265]: 2025-09-30 07:23:33.197 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "refresh_cache-20fa7912-824c-4a80-87c5-319b83c0031b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:23:33 compute-0 nova_compute[189265]: 2025-09-30 07:23:33.262 2 WARNING neutronclient.v2_0.client [req-11098a9c-53ab-406e-8eb4-cdbd13f1dcb6 req-1a66a35b-15e6-4a75-b1e9-9d4ce189882d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:23:33 compute-0 nova_compute[189265]: 2025-09-30 07:23:33.420 2 DEBUG nova.network.neutron [req-11098a9c-53ab-406e-8eb4-cdbd13f1dcb6 req-1a66a35b-15e6-4a75-b1e9-9d4ce189882d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:23:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:33.433 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:23:33 compute-0 nova_compute[189265]: 2025-09-30 07:23:33.586 2 DEBUG nova.network.neutron [req-11098a9c-53ab-406e-8eb4-cdbd13f1dcb6 req-1a66a35b-15e6-4a75-b1e9-9d4ce189882d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:23:34 compute-0 nova_compute[189265]: 2025-09-30 07:23:34.093 2 DEBUG oslo_concurrency.lockutils [req-11098a9c-53ab-406e-8eb4-cdbd13f1dcb6 req-1a66a35b-15e6-4a75-b1e9-9d4ce189882d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-20fa7912-824c-4a80-87c5-319b83c0031b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:23:34 compute-0 nova_compute[189265]: 2025-09-30 07:23:34.094 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquired lock "refresh_cache-20fa7912-824c-4a80-87c5-319b83c0031b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:23:34 compute-0 nova_compute[189265]: 2025-09-30 07:23:34.094 2 DEBUG nova.network.neutron [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:23:35 compute-0 nova_compute[189265]: 2025-09-30 07:23:35.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:35 compute-0 nova_compute[189265]: 2025-09-30 07:23:35.431 2 DEBUG nova.network.neutron [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:23:35 compute-0 nova_compute[189265]: 2025-09-30 07:23:35.698 2 WARNING neutronclient.v2_0.client [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:23:35 compute-0 nova_compute[189265]: 2025-09-30 07:23:35.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:23:36 compute-0 nova_compute[189265]: 2025-09-30 07:23:36.580 2 DEBUG nova.network.neutron [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Updating instance_info_cache with network_info: [{"id": "acb8976e-1c6c-4332-be28-4b0a44f90678", "address": "fa:16:3e:74:f4:50", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacb8976e-1c", "ovs_interfaceid": "acb8976e-1c6c-4332-be28-4b0a44f90678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:23:36 compute-0 nova_compute[189265]: 2025-09-30 07:23:36.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:23:36 compute-0 nova_compute[189265]: 2025-09-30 07:23:36.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.143 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Releasing lock "refresh_cache-20fa7912-824c-4a80-87c5-319b83c0031b" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.144 2 DEBUG nova.compute.manager [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Instance network_info: |[{"id": "acb8976e-1c6c-4332-be28-4b0a44f90678", "address": "fa:16:3e:74:f4:50", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacb8976e-1c", "ovs_interfaceid": "acb8976e-1c6c-4332-be28-4b0a44f90678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.149 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Start _get_guest_xml network_info=[{"id": "acb8976e-1c6c-4332-be28-4b0a44f90678", "address": "fa:16:3e:74:f4:50", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacb8976e-1c", "ovs_interfaceid": "acb8976e-1c6c-4332-be28-4b0a44f90678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '0c6b92f5-9861-49e4-862d-3ffd84520dfa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.156 2 WARNING nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.158 2 DEBUG nova.virt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-592763394', uuid='20fa7912-824c-4a80-87c5-319b83c0031b'), owner=OwnerMeta(userid='071bf5838f2f473a865873b6f7846f84', username='tempest-TestExecuteHostMaintenanceStrategy-385408215-project-admin', projectid='2ad7bd988b6047509c2c19eb4e0dc32c', projectname='tempest-TestExecuteHostMaintenanceStrategy-385408215'), image=ImageMeta(id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "acb8976e-1c6c-4332-be28-4b0a44f90678", "address": "fa:16:3e:74:f4:50", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacb8976e-1c", "ovs_interfaceid": 
"acb8976e-1c6c-4332-be28-4b0a44f90678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759217017.1584966) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.166 2 DEBUG nova.virt.libvirt.host [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.167 2 DEBUG nova.virt.libvirt.host [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.170 2 DEBUG nova.virt.libvirt.host [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.171 2 DEBUG nova.virt.libvirt.host [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.172 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.172 2 DEBUG nova.virt.hardware [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T07:07:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.173 2 DEBUG nova.virt.hardware [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.173 2 DEBUG nova.virt.hardware [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.174 2 DEBUG nova.virt.hardware [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.174 2 DEBUG nova.virt.hardware [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.175 2 DEBUG nova.virt.hardware [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.175 2 DEBUG nova.virt.hardware [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.175 2 DEBUG nova.virt.hardware [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.176 2 DEBUG nova.virt.hardware [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.176 2 DEBUG nova.virt.hardware [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.177 2 DEBUG nova.virt.hardware [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.182 2 DEBUG nova.virt.libvirt.vif [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:23:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-592763394',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-592763394',id=11,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2ad7bd988b6047509c2c19eb4e0dc32c',ramdisk_id='',reservation_id='r-owcaxa3t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-385408215',owner_user_name='tempest-TestExecu
teHostMaintenanceStrategy-385408215-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:23:30Z,user_data=None,user_id='071bf5838f2f473a865873b6f7846f84',uuid=20fa7912-824c-4a80-87c5-319b83c0031b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "acb8976e-1c6c-4332-be28-4b0a44f90678", "address": "fa:16:3e:74:f4:50", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacb8976e-1c", "ovs_interfaceid": "acb8976e-1c6c-4332-be28-4b0a44f90678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.183 2 DEBUG nova.network.os_vif_util [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Converting VIF {"id": "acb8976e-1c6c-4332-be28-4b0a44f90678", "address": "fa:16:3e:74:f4:50", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacb8976e-1c", "ovs_interfaceid": "acb8976e-1c6c-4332-be28-4b0a44f90678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.184 2 DEBUG nova.network.os_vif_util [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:f4:50,bridge_name='br-int',has_traffic_filtering=True,id=acb8976e-1c6c-4332-be28-4b0a44f90678,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacb8976e-1c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.185 2 DEBUG nova.objects.instance [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lazy-loading 'pci_devices' on Instance uuid 20fa7912-824c-4a80-87c5-319b83c0031b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.693 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] End _get_guest_xml xml=<domain type="kvm">
Sep 30 07:23:37 compute-0 nova_compute[189265]:   <uuid>20fa7912-824c-4a80-87c5-319b83c0031b</uuid>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   <name>instance-0000000b</name>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   <memory>131072</memory>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   <vcpu>1</vcpu>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-592763394</nova:name>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:23:37</nova:creationTime>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:23:37 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:23:37 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:23:37 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:23:37 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:23:37 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:23:37 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:23:37 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:23:37 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:23:37 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:23:37 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:23:37 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:23:37 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:23:37 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:23:37 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:23:37 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:23:37 compute-0 nova_compute[189265]:         <nova:user uuid="071bf5838f2f473a865873b6f7846f84">tempest-TestExecuteHostMaintenanceStrategy-385408215-project-admin</nova:user>
Sep 30 07:23:37 compute-0 nova_compute[189265]:         <nova:project uuid="2ad7bd988b6047509c2c19eb4e0dc32c">tempest-TestExecuteHostMaintenanceStrategy-385408215</nova:project>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:23:37 compute-0 nova_compute[189265]:         <nova:port uuid="acb8976e-1c6c-4332-be28-4b0a44f90678">
Sep 30 07:23:37 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <system>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <entry name="serial">20fa7912-824c-4a80-87c5-319b83c0031b</entry>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <entry name="uuid">20fa7912-824c-4a80-87c5-319b83c0031b</entry>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     </system>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   <os>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   </os>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   <features>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <vmcoreinfo/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   </features>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   <cpu mode="host-model" match="exact">
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk.config"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:74:f4:50"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <target dev="tapacb8976e-1c"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/console.log" append="off"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <video>
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     </video>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <controller type="usb" index="0"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:23:37 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:23:37 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:23:37 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:23:37 compute-0 nova_compute[189265]: </domain>
Sep 30 07:23:37 compute-0 nova_compute[189265]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.694 2 DEBUG nova.compute.manager [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Preparing to wait for external event network-vif-plugged-acb8976e-1c6c-4332-be28-4b0a44f90678 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.695 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.695 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.695 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.696 2 DEBUG nova.virt.libvirt.vif [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:23:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-592763394',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-592763394',id=11,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2ad7bd988b6047509c2c19eb4e0dc32c',ramdisk_id='',reservation_id='r-owcaxa3t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-385408215',owner_user_name='tempest
-TestExecuteHostMaintenanceStrategy-385408215-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:23:30Z,user_data=None,user_id='071bf5838f2f473a865873b6f7846f84',uuid=20fa7912-824c-4a80-87c5-319b83c0031b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "acb8976e-1c6c-4332-be28-4b0a44f90678", "address": "fa:16:3e:74:f4:50", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacb8976e-1c", "ovs_interfaceid": "acb8976e-1c6c-4332-be28-4b0a44f90678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.696 2 DEBUG nova.network.os_vif_util [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Converting VIF {"id": "acb8976e-1c6c-4332-be28-4b0a44f90678", "address": "fa:16:3e:74:f4:50", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacb8976e-1c", "ovs_interfaceid": "acb8976e-1c6c-4332-be28-4b0a44f90678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.697 2 DEBUG nova.network.os_vif_util [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:f4:50,bridge_name='br-int',has_traffic_filtering=True,id=acb8976e-1c6c-4332-be28-4b0a44f90678,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacb8976e-1c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.697 2 DEBUG os_vif [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:f4:50,bridge_name='br-int',has_traffic_filtering=True,id=acb8976e-1c6c-4332-be28-4b0a44f90678,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacb8976e-1c') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.698 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.698 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.699 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9b4634ee-137b-5a4f-be85-fe2225e31f33', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.701 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.703 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapacb8976e-1c, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.704 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapacb8976e-1c, col_values=(('qos', UUID('629553d3-ac53-497a-9bb7-cddb78935d6d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.704 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapacb8976e-1c, col_values=(('external_ids', {'iface-id': 'acb8976e-1c6c-4332-be28-4b0a44f90678', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:74:f4:50', 'vm-uuid': '20fa7912-824c-4a80-87c5-319b83c0031b'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:37 compute-0 NetworkManager[51813]: <info>  [1759217017.7059] manager: (tapacb8976e-1c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:37 compute-0 nova_compute[189265]: 2025-09-30 07:23:37.715 2 INFO os_vif [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:f4:50,bridge_name='br-int',has_traffic_filtering=True,id=acb8976e-1c6c-4332-be28-4b0a44f90678,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacb8976e-1c')
Sep 30 07:23:38 compute-0 nova_compute[189265]: 2025-09-30 07:23:38.304 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:23:38 compute-0 nova_compute[189265]: 2025-09-30 07:23:38.305 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:23:38 compute-0 nova_compute[189265]: 2025-09-30 07:23:38.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:23:39 compute-0 nova_compute[189265]: 2025-09-30 07:23:39.257 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:23:39 compute-0 nova_compute[189265]: 2025-09-30 07:23:39.257 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:23:39 compute-0 nova_compute[189265]: 2025-09-30 07:23:39.258 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] No VIF found with MAC fa:16:3e:74:f4:50, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 07:23:39 compute-0 nova_compute[189265]: 2025-09-30 07:23:39.258 2 INFO nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Using config drive
Sep 30 07:23:39 compute-0 nova_compute[189265]: 2025-09-30 07:23:39.768 2 WARNING neutronclient.v2_0.client [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.274 2 INFO nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Creating config drive at /var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk.config
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.283 2 DEBUG oslo_concurrency.processutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpq56qoact execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.427 2 DEBUG oslo_concurrency.processutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpq56qoact" returned: 0 in 0.144s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:23:40 compute-0 kernel: tapacb8976e-1c: entered promiscuous mode
Sep 30 07:23:40 compute-0 NetworkManager[51813]: <info>  [1759217020.5153] manager: (tapacb8976e-1c): new Tun device (/org/freedesktop/NetworkManager/Devices/41)
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:40 compute-0 ovn_controller[91436]: 2025-09-30T07:23:40Z|00106|binding|INFO|Claiming lport acb8976e-1c6c-4332-be28-4b0a44f90678 for this chassis.
Sep 30 07:23:40 compute-0 ovn_controller[91436]: 2025-09-30T07:23:40Z|00107|binding|INFO|acb8976e-1c6c-4332-be28-4b0a44f90678: Claiming fa:16:3e:74:f4:50 10.100.0.12
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.534 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.544 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:f4:50 10.100.0.12'], port_security=['fa:16:3e:74:f4:50 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '20fa7912-824c-4a80-87c5-319b83c0031b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ad7bd988b6047509c2c19eb4e0dc32c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dc82e88d-abda-4feb-bd34-afbed64798c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea21a402-508c-472e-bd89-e4a2e8cde5bb, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=acb8976e-1c6c-4332-be28-4b0a44f90678) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.546 100322 INFO neutron.agent.ovn.metadata.agent [-] Port acb8976e-1c6c-4332-be28-4b0a44f90678 in datapath 0a07ba3d-468f-4279-9be2-b3ef141df6a7 bound to our chassis
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.547 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a07ba3d-468f-4279-9be2-b3ef141df6a7
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.566 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[bf5939bd-59b9-4530-bae5-593ebaf34fba]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.567 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0a07ba3d-41 in ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 07:23:40 compute-0 systemd-udevd[216077]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.569 210650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0a07ba3d-40 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.569 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[4fd03f33-ab33-4513-8594-3935c8614f0b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 systemd-machined[149233]: New machine qemu-7-instance-0000000b.
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.572 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[5d430c4f-93b6-42a2-9fef-1bc7cfddd578]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 NetworkManager[51813]: <info>  [1759217020.5910] device (tapacb8976e-1c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:23:40 compute-0 NetworkManager[51813]: <info>  [1759217020.5926] device (tapacb8976e-1c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.592 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[081c6a64-55aa-4976-b904-4cc2a637e049]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 ovn_controller[91436]: 2025-09-30T07:23:40Z|00108|binding|INFO|Setting lport acb8976e-1c6c-4332-be28-4b0a44f90678 ovn-installed in OVS
Sep 30 07:23:40 compute-0 ovn_controller[91436]: 2025-09-30T07:23:40Z|00109|binding|INFO|Setting lport acb8976e-1c6c-4332-be28-4b0a44f90678 up in Southbound
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:40 compute-0 systemd[1]: Started Virtual Machine qemu-7-instance-0000000b.
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.620 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[22dffb51-037b-48f1-8336-9837acd5cfc8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.659 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[adaceffa-703e-40b2-bee7-96c81574d762]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.662 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[8429c77c-a659-4e3a-8d43-e15b426658a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 NetworkManager[51813]: <info>  [1759217020.6644] manager: (tap0a07ba3d-40): new Veth device (/org/freedesktop/NetworkManager/Devices/42)
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.694 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[abecae15-3283-4871-b059-dc6a774ea469]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.696 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[55e7b76e-9c4f-43a0-8499-d12f1101d923]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 NetworkManager[51813]: <info>  [1759217020.7208] device (tap0a07ba3d-40): carrier: link connected
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.727 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[9772beba-3ce7-4ba3-9446-c499dfbb517c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.744 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d4a4215e-0270-48d3-ae7f-0077f251ecdf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a07ba3d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:7d:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487844, 'reachable_time': 41346, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216109, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.760 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d592c757-20e4-4cf6-9e91-6a4dc9148252]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:7dc8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487844, 'tstamp': 487844}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216110, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.776 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[4020d02c-ff11-44ab-a194-c4250f2457a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a07ba3d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:7d:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487844, 'reachable_time': 41346, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216111, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.776 2 DEBUG nova.compute.manager [req-cbddace6-8308-49e4-a6ac-d55bfbcc4820 req-bc3c4141-cb94-4f15-b7f9-ef9a35369af0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Received event network-vif-plugged-acb8976e-1c6c-4332-be28-4b0a44f90678 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.776 2 DEBUG oslo_concurrency.lockutils [req-cbddace6-8308-49e4-a6ac-d55bfbcc4820 req-bc3c4141-cb94-4f15-b7f9-ef9a35369af0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.777 2 DEBUG oslo_concurrency.lockutils [req-cbddace6-8308-49e4-a6ac-d55bfbcc4820 req-bc3c4141-cb94-4f15-b7f9-ef9a35369af0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.777 2 DEBUG oslo_concurrency.lockutils [req-cbddace6-8308-49e4-a6ac-d55bfbcc4820 req-bc3c4141-cb94-4f15-b7f9-ef9a35369af0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.777 2 DEBUG nova.compute.manager [req-cbddace6-8308-49e4-a6ac-d55bfbcc4820 req-bc3c4141-cb94-4f15-b7f9-ef9a35369af0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Processing event network-vif-plugged-acb8976e-1c6c-4332-be28-4b0a44f90678 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.805 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[52d1e060-0956-4ce8-8010-c6abab704968]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.856 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[4c373ce4-c153-45b9-830c-1836afed665d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.857 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a07ba3d-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.857 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.858 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a07ba3d-40, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.859 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:40 compute-0 NetworkManager[51813]: <info>  [1759217020.8600] manager: (tap0a07ba3d-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Sep 30 07:23:40 compute-0 kernel: tap0a07ba3d-40: entered promiscuous mode
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.865 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a07ba3d-40, col_values=(('external_ids', {'iface-id': '8b5421e3-6f92-4b98-bc3c-4670813d915c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:23:40 compute-0 ovn_controller[91436]: 2025-09-30T07:23:40Z|00110|binding|INFO|Releasing lport 8b5421e3-6f92-4b98-bc3c-4670813d915c from this chassis (sb_readonly=0)
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:40 compute-0 nova_compute[189265]: 2025-09-30 07:23:40.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.881 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[31e41229-7648-4b3b-beab-06d8a49af6a4]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.881 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.882 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.882 100322 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 0a07ba3d-468f-4279-9be2-b3ef141df6a7 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.882 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.882 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ce7a3b-2083-4d0e-930f-2dd367c51af3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.883 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.883 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[2d22d266-ee27-4e73-bd9d-7bb4f5f37137]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.884 100322 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: global
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     log         /dev/log local0 debug
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     log-tag     haproxy-metadata-proxy-0a07ba3d-468f-4279-9be2-b3ef141df6a7
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     user        root
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     group       root
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     maxconn     1024
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     pidfile     /var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     daemon
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: defaults
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     log global
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     mode http
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     option httplog
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     option dontlognull
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     option http-server-close
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     option forwardfor
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     retries                 3
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     timeout http-request    30s
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     timeout connect         30s
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     timeout client          32s
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     timeout server          32s
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     timeout http-keep-alive 30s
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: listen listener
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     bind 169.254.169.254:80
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:     http-request add-header X-OVN-Network-ID 0a07ba3d-468f-4279-9be2-b3ef141df6a7
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 07:23:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:23:40.885 100322 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'env', 'PROCESS_TAG=haproxy-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0a07ba3d-468f-4279-9be2-b3ef141df6a7.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 07:23:41 compute-0 podman[216143]: 2025-09-30 07:23:41.288634307 +0000 UTC m=+0.062939129 container create f4af1648a0e6e8d19e94079987a314bb1fdfbc33bb064f1d906f5cf06e42dcae (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:23:41 compute-0 nova_compute[189265]: 2025-09-30 07:23:41.303 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:23:41 compute-0 nova_compute[189265]: 2025-09-30 07:23:41.304 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:23:41 compute-0 nova_compute[189265]: 2025-09-30 07:23:41.304 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:23:41 compute-0 nova_compute[189265]: 2025-09-30 07:23:41.305 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:23:41 compute-0 podman[216143]: 2025-09-30 07:23:41.2479058 +0000 UTC m=+0.022210672 image pull eeebcc09bc72f81ab45f5ab87eb8f6a7b554b949227aeec082bdb0732754ddc8 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 07:23:41 compute-0 systemd[1]: Started libpod-conmon-f4af1648a0e6e8d19e94079987a314bb1fdfbc33bb064f1d906f5cf06e42dcae.scope.
Sep 30 07:23:41 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:23:41 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/98629fffc2117081759fc3563d10d6fc234fa87f825daa7bf8d4a0f140c23aba/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 07:23:41 compute-0 podman[216143]: 2025-09-30 07:23:41.403255757 +0000 UTC m=+0.177560639 container init f4af1648a0e6e8d19e94079987a314bb1fdfbc33bb064f1d906f5cf06e42dcae (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 07:23:41 compute-0 podman[216143]: 2025-09-30 07:23:41.410204998 +0000 UTC m=+0.184509830 container start f4af1648a0e6e8d19e94079987a314bb1fdfbc33bb064f1d906f5cf06e42dcae (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4)
Sep 30 07:23:41 compute-0 neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7[216159]: [NOTICE]   (216163) : New worker (216165) forked
Sep 30 07:23:41 compute-0 neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7[216159]: [NOTICE]   (216163) : Loading success.
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.014 2 DEBUG nova.compute.manager [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.018 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.021 2 INFO nova.virt.libvirt.driver [-] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Instance spawned successfully.
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.022 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.413 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.483 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.485 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.575 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.576 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.577 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.577 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.578 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.579 2 DEBUG nova.virt.libvirt.driver [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.586 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b/disk --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.800 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.802 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.845 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.845 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5812MB free_disk=73.30706787109375GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.846 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.846 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.868 2 DEBUG nova.compute.manager [req-b6bcf8f1-ce16-4b49-b9fa-4e7ebe108fe0 req-c6841feb-9ebd-4014-abb0-eaafe8fd46be 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Received event network-vif-plugged-acb8976e-1c6c-4332-be28-4b0a44f90678 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.868 2 DEBUG oslo_concurrency.lockutils [req-b6bcf8f1-ce16-4b49-b9fa-4e7ebe108fe0 req-c6841feb-9ebd-4014-abb0-eaafe8fd46be 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.868 2 DEBUG oslo_concurrency.lockutils [req-b6bcf8f1-ce16-4b49-b9fa-4e7ebe108fe0 req-c6841feb-9ebd-4014-abb0-eaafe8fd46be 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.869 2 DEBUG oslo_concurrency.lockutils [req-b6bcf8f1-ce16-4b49-b9fa-4e7ebe108fe0 req-c6841feb-9ebd-4014-abb0-eaafe8fd46be 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.869 2 DEBUG nova.compute.manager [req-b6bcf8f1-ce16-4b49-b9fa-4e7ebe108fe0 req-c6841feb-9ebd-4014-abb0-eaafe8fd46be 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] No waiting events found dispatching network-vif-plugged-acb8976e-1c6c-4332-be28-4b0a44f90678 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:23:42 compute-0 nova_compute[189265]: 2025-09-30 07:23:42.869 2 WARNING nova.compute.manager [req-b6bcf8f1-ce16-4b49-b9fa-4e7ebe108fe0 req-c6841feb-9ebd-4014-abb0-eaafe8fd46be 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Received unexpected event network-vif-plugged-acb8976e-1c6c-4332-be28-4b0a44f90678 for instance with vm_state building and task_state spawning.
Sep 30 07:23:43 compute-0 nova_compute[189265]: 2025-09-30 07:23:43.093 2 INFO nova.compute.manager [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Took 11.31 seconds to spawn the instance on the hypervisor.
Sep 30 07:23:43 compute-0 nova_compute[189265]: 2025-09-30 07:23:43.094 2 DEBUG nova.compute.manager [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:23:43 compute-0 nova_compute[189265]: 2025-09-30 07:23:43.635 2 INFO nova.compute.manager [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Took 16.58 seconds to build instance.
Sep 30 07:23:43 compute-0 nova_compute[189265]: 2025-09-30 07:23:43.910 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance 20fa7912-824c-4a80-87c5-319b83c0031b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:23:43 compute-0 nova_compute[189265]: 2025-09-30 07:23:43.910 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:23:43 compute-0 nova_compute[189265]: 2025-09-30 07:23:43.911 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:23:42 up  1:21,  0 user,  load average: 0.01, 0.20, 0.36\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_2ad7bd988b6047509c2c19eb4e0dc32c': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:23:43 compute-0 nova_compute[189265]: 2025-09-30 07:23:43.951 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:23:44 compute-0 nova_compute[189265]: 2025-09-30 07:23:44.141 2 DEBUG oslo_concurrency.lockutils [None req-927b9c93-ff1c-4281-b9da-7ba51f23f57d 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "20fa7912-824c-4a80-87c5-319b83c0031b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.120s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:23:44 compute-0 nova_compute[189265]: 2025-09-30 07:23:44.458 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:23:44 compute-0 nova_compute[189265]: 2025-09-30 07:23:44.969 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:23:44 compute-0 nova_compute[189265]: 2025-09-30 07:23:44.970 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.124s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:23:45 compute-0 podman[216188]: 2025-09-30 07:23:45.487021252 +0000 UTC m=+0.071838976 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:23:46 compute-0 nova_compute[189265]: 2025-09-30 07:23:46.966 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:23:47 compute-0 nova_compute[189265]: 2025-09-30 07:23:47.479 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:23:47 compute-0 nova_compute[189265]: 2025-09-30 07:23:47.479 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:23:47 compute-0 nova_compute[189265]: 2025-09-30 07:23:47.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:47 compute-0 nova_compute[189265]: 2025-09-30 07:23:47.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:47 compute-0 nova_compute[189265]: 2025-09-30 07:23:47.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:23:47 compute-0 nova_compute[189265]: 2025-09-30 07:23:47.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:23:47 compute-0 nova_compute[189265]: 2025-09-30 07:23:47.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 07:23:48 compute-0 nova_compute[189265]: 2025-09-30 07:23:48.295 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 07:23:52 compute-0 nova_compute[189265]: 2025-09-30 07:23:52.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:52 compute-0 nova_compute[189265]: 2025-09-30 07:23:52.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:53 compute-0 ovn_controller[91436]: 2025-09-30T07:23:53Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:74:f4:50 10.100.0.12
Sep 30 07:23:53 compute-0 ovn_controller[91436]: 2025-09-30T07:23:53Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:74:f4:50 10.100.0.12
Sep 30 07:23:54 compute-0 podman[216226]: 2025-09-30 07:23:54.50047085 +0000 UTC m=+0.083472253 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:23:57 compute-0 nova_compute[189265]: 2025-09-30 07:23:57.660 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:57 compute-0 nova_compute[189265]: 2025-09-30 07:23:57.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:23:57 compute-0 nova_compute[189265]: 2025-09-30 07:23:57.923 2 DEBUG nova.virt.libvirt.driver [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Creating tmpfile /var/lib/nova/instances/tmpsa8owwz5 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 07:23:57 compute-0 nova_compute[189265]: 2025-09-30 07:23:57.925 2 WARNING neutronclient.v2_0.client [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:23:57 compute-0 nova_compute[189265]: 2025-09-30 07:23:57.943 2 DEBUG nova.compute.manager [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsa8owwz5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 07:23:58 compute-0 podman[216249]: 2025-09-30 07:23:58.046522117 +0000 UTC m=+0.081908107 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, release=1755695350, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-type=git, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 07:23:58 compute-0 nova_compute[189265]: 2025-09-30 07:23:58.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:23:58 compute-0 nova_compute[189265]: 2025-09-30 07:23:58.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 07:23:59 compute-0 podman[199733]: time="2025-09-30T07:23:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:23:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:23:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:23:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:23:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3467 "" "Go-http-client/1.1"
Sep 30 07:23:59 compute-0 nova_compute[189265]: 2025-09-30 07:23:59.991 2 WARNING neutronclient.v2_0.client [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:01 compute-0 openstack_network_exporter[201859]: ERROR   07:24:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:24:01 compute-0 openstack_network_exporter[201859]: ERROR   07:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:24:01 compute-0 openstack_network_exporter[201859]: ERROR   07:24:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:24:01 compute-0 openstack_network_exporter[201859]: ERROR   07:24:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:24:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:24:01 compute-0 openstack_network_exporter[201859]: ERROR   07:24:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:24:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:24:01 compute-0 podman[216271]: 2025-09-30 07:24:01.525313442 +0000 UTC m=+0.081595499 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd)
Sep 30 07:24:01 compute-0 podman[216272]: 2025-09-30 07:24:01.525695843 +0000 UTC m=+0.071457176 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 07:24:01 compute-0 podman[216279]: 2025-09-30 07:24:01.598484637 +0000 UTC m=+0.140581634 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 07:24:02 compute-0 nova_compute[189265]: 2025-09-30 07:24:02.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:02 compute-0 nova_compute[189265]: 2025-09-30 07:24:02.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:04 compute-0 nova_compute[189265]: 2025-09-30 07:24:04.270 2 DEBUG nova.compute.manager [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsa8owwz5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d3a690a6-3b2e-4b96-9e64-dd1beeb976cf',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 07:24:05 compute-0 nova_compute[189265]: 2025-09-30 07:24:05.285 2 DEBUG oslo_concurrency.lockutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-d3a690a6-3b2e-4b96-9e64-dd1beeb976cf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:24:05 compute-0 nova_compute[189265]: 2025-09-30 07:24:05.286 2 DEBUG oslo_concurrency.lockutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-d3a690a6-3b2e-4b96-9e64-dd1beeb976cf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:24:05 compute-0 nova_compute[189265]: 2025-09-30 07:24:05.286 2 DEBUG nova.network.neutron [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:24:05 compute-0 nova_compute[189265]: 2025-09-30 07:24:05.794 2 WARNING neutronclient.v2_0.client [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:06 compute-0 nova_compute[189265]: 2025-09-30 07:24:06.853 2 WARNING neutronclient.v2_0.client [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:07 compute-0 nova_compute[189265]: 2025-09-30 07:24:07.546 2 DEBUG nova.network.neutron [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Updating instance_info_cache with network_info: [{"id": "2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d", "address": "fa:16:3e:aa:53:84", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f5ca96e-ac", "ovs_interfaceid": "2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:24:07 compute-0 nova_compute[189265]: 2025-09-30 07:24:07.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:24:07 compute-0 nova_compute[189265]: 2025-09-30 07:24:07.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:24:07 compute-0 nova_compute[189265]: 2025-09-30 07:24:07.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.12/site-packages/ovs/reconnect.py:117
Sep 30 07:24:07 compute-0 nova_compute[189265]: 2025-09-30 07:24:07.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 07:24:07 compute-0 nova_compute[189265]: 2025-09-30 07:24:07.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:07 compute-0 nova_compute[189265]: 2025-09-30 07:24:07.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.12/site-packages/ovs/reconnect.py:519
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.055 2 DEBUG oslo_concurrency.lockutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-d3a690a6-3b2e-4b96-9e64-dd1beeb976cf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.071 2 DEBUG nova.virt.libvirt.driver [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsa8owwz5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d3a690a6-3b2e-4b96-9e64-dd1beeb976cf',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.072 2 DEBUG nova.virt.libvirt.driver [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Creating instance directory: /var/lib/nova/instances/d3a690a6-3b2e-4b96-9e64-dd1beeb976cf pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.072 2 DEBUG nova.virt.libvirt.driver [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Creating disk.info with the contents: {'/var/lib/nova/instances/d3a690a6-3b2e-4b96-9e64-dd1beeb976cf/disk': 'qcow2', '/var/lib/nova/instances/d3a690a6-3b2e-4b96-9e64-dd1beeb976cf/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.073 2 DEBUG nova.virt.libvirt.driver [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.074 2 DEBUG nova.objects.instance [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid d3a690a6-3b2e-4b96-9e64-dd1beeb976cf obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.583 2 DEBUG oslo_utils.imageutils.format_inspector [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.591 2 DEBUG oslo_utils.imageutils.format_inspector [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.596 2 DEBUG oslo_concurrency.processutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.684 2 DEBUG oslo_concurrency.processutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.686 2 DEBUG oslo_concurrency.lockutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.688 2 DEBUG oslo_concurrency.lockutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.689 2 DEBUG oslo_utils.imageutils.format_inspector [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.696 2 DEBUG oslo_utils.imageutils.format_inspector [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.697 2 DEBUG oslo_concurrency.processutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.770 2 DEBUG oslo_concurrency.processutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.772 2 DEBUG oslo_concurrency.processutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/d3a690a6-3b2e-4b96-9e64-dd1beeb976cf/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.807 2 DEBUG oslo_concurrency.processutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/d3a690a6-3b2e-4b96-9e64-dd1beeb976cf/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.808 2 DEBUG oslo_concurrency.lockutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.809 2 DEBUG oslo_concurrency.processutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.860 2 DEBUG oslo_concurrency.processutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.861 2 DEBUG nova.virt.disk.api [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Checking if we can resize image /var/lib/nova/instances/d3a690a6-3b2e-4b96-9e64-dd1beeb976cf/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.862 2 DEBUG oslo_concurrency.processutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d3a690a6-3b2e-4b96-9e64-dd1beeb976cf/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.914 2 DEBUG oslo_concurrency.processutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d3a690a6-3b2e-4b96-9e64-dd1beeb976cf/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.915 2 DEBUG nova.virt.disk.api [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Cannot resize image /var/lib/nova/instances/d3a690a6-3b2e-4b96-9e64-dd1beeb976cf/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:24:08 compute-0 nova_compute[189265]: 2025-09-30 07:24:08.915 2 DEBUG nova.objects.instance [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'migration_context' on Instance uuid d3a690a6-3b2e-4b96-9e64-dd1beeb976cf obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.440 2 DEBUG nova.objects.base [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<d3a690a6-3b2e-4b96-9e64-dd1beeb976cf> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.441 2 DEBUG oslo_concurrency.processutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/d3a690a6-3b2e-4b96-9e64-dd1beeb976cf/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.469 2 DEBUG oslo_concurrency.processutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/d3a690a6-3b2e-4b96-9e64-dd1beeb976cf/disk.config 497664" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.470 2 DEBUG nova.virt.libvirt.driver [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.471 2 DEBUG nova.virt.libvirt.vif [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T07:23:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-12837669',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-12837669',id=10,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:23:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2ad7bd988b6047509c2c19eb4e0dc32c',ramdisk_id='',reservation_id='r-mf8s205z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-385408215',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-385408215-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:23:19Z,user_data=None,user_id='071bf5838f2f473a865873b6f7846f84',uuid=d3a690a6-3b2e-4b96-9e64-dd1beeb976cf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d", "address": "fa:16:3e:aa:53:84", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2f5ca96e-ac", "ovs_interfaceid": "2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.472 2 DEBUG nova.network.os_vif_util [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d", "address": "fa:16:3e:aa:53:84", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap2f5ca96e-ac", "ovs_interfaceid": "2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.473 2 DEBUG nova.network.os_vif_util [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:aa:53:84,bridge_name='br-int',has_traffic_filtering=True,id=2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f5ca96e-ac') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.473 2 DEBUG os_vif [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:53:84,bridge_name='br-int',has_traffic_filtering=True,id=2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f5ca96e-ac') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.475 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.475 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.476 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'de1712af-f91e-5863-8047-24f42e71fa65', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.484 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2f5ca96e-ac, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.485 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap2f5ca96e-ac, col_values=(('qos', UUID('910f8103-e62d-424b-9c71-a0ddd0a231e0')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.485 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap2f5ca96e-ac, col_values=(('external_ids', {'iface-id': '2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:aa:53:84', 'vm-uuid': 'd3a690a6-3b2e-4b96-9e64-dd1beeb976cf'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:09 compute-0 NetworkManager[51813]: <info>  [1759217049.4875] manager: (tap2f5ca96e-ac): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.497 2 INFO os_vif [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:aa:53:84,bridge_name='br-int',has_traffic_filtering=True,id=2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f5ca96e-ac')
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.498 2 DEBUG nova.virt.libvirt.driver [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.498 2 DEBUG nova.compute.manager [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsa8owwz5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d3a690a6-3b2e-4b96-9e64-dd1beeb976cf',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.499 2 WARNING neutronclient.v2_0.client [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:09 compute-0 nova_compute[189265]: 2025-09-30 07:24:09.648 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:24:10 compute-0 nova_compute[189265]: 2025-09-30 07:24:10.161 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Triggering sync for uuid 20fa7912-824c-4a80-87c5-319b83c0031b _sync_power_states /usr/lib/python3.12/site-packages/nova/compute/manager.py:11020
Sep 30 07:24:10 compute-0 nova_compute[189265]: 2025-09-30 07:24:10.162 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "20fa7912-824c-4a80-87c5-319b83c0031b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:10 compute-0 nova_compute[189265]: 2025-09-30 07:24:10.162 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "20fa7912-824c-4a80-87c5-319b83c0031b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:10 compute-0 nova_compute[189265]: 2025-09-30 07:24:10.239 2 WARNING neutronclient.v2_0.client [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:10 compute-0 nova_compute[189265]: 2025-09-30 07:24:10.674 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "20fa7912-824c-4a80-87c5-319b83c0031b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.511s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:10 compute-0 ovn_controller[91436]: 2025-09-30T07:24:10Z|00111|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 07:24:11 compute-0 nova_compute[189265]: 2025-09-30 07:24:11.139 2 DEBUG nova.network.neutron [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Port 2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 07:24:11 compute-0 nova_compute[189265]: 2025-09-30 07:24:11.155 2 DEBUG nova.compute.manager [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpsa8owwz5',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d3a690a6-3b2e-4b96-9e64-dd1beeb976cf',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 07:24:12 compute-0 nova_compute[189265]: 2025-09-30 07:24:12.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:14 compute-0 systemd[1]: Starting libvirt proxy daemon...
Sep 30 07:24:14 compute-0 systemd[1]: Started libvirt proxy daemon.
Sep 30 07:24:14 compute-0 nova_compute[189265]: 2025-09-30 07:24:14.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:14 compute-0 kernel: tap2f5ca96e-ac: entered promiscuous mode
Sep 30 07:24:14 compute-0 NetworkManager[51813]: <info>  [1759217054.5709] manager: (tap2f5ca96e-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/45)
Sep 30 07:24:14 compute-0 nova_compute[189265]: 2025-09-30 07:24:14.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:14 compute-0 ovn_controller[91436]: 2025-09-30T07:24:14Z|00112|binding|INFO|Claiming lport 2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d for this additional chassis.
Sep 30 07:24:14 compute-0 ovn_controller[91436]: 2025-09-30T07:24:14Z|00113|binding|INFO|2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d: Claiming fa:16:3e:aa:53:84 10.100.0.11
Sep 30 07:24:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:14.580 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:53:84 10.100.0.11'], port_security=['fa:16:3e:aa:53:84 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'd3a690a6-3b2e-4b96-9e64-dd1beeb976cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ad7bd988b6047509c2c19eb4e0dc32c', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'dc82e88d-abda-4feb-bd34-afbed64798c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea21a402-508c-472e-bd89-e4a2e8cde5bb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:24:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:14.582 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d in datapath 0a07ba3d-468f-4279-9be2-b3ef141df6a7 unbound from our chassis
Sep 30 07:24:14 compute-0 ovn_controller[91436]: 2025-09-30T07:24:14Z|00114|binding|INFO|Setting lport 2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d ovn-installed in OVS
Sep 30 07:24:14 compute-0 nova_compute[189265]: 2025-09-30 07:24:14.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:14.584 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a07ba3d-468f-4279-9be2-b3ef141df6a7
Sep 30 07:24:14 compute-0 nova_compute[189265]: 2025-09-30 07:24:14.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:14.600 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[9bae8953-df31-446f-9857-04d6ee4bb259]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:14 compute-0 systemd-machined[149233]: New machine qemu-8-instance-0000000a.
Sep 30 07:24:14 compute-0 systemd[1]: Started Virtual Machine qemu-8-instance-0000000a.
Sep 30 07:24:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:14.629 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[1984e6b7-7b78-4ea9-bf0c-42575fce2294]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:14.631 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[614c5370-fdae-4725-a616-4ec99e70af26]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:14 compute-0 systemd-udevd[216388]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:24:14 compute-0 NetworkManager[51813]: <info>  [1759217054.6429] device (tap2f5ca96e-ac): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:24:14 compute-0 NetworkManager[51813]: <info>  [1759217054.6440] device (tap2f5ca96e-ac): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:24:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:14.664 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[61ec63dc-5d77-4960-ab81-9b0c36b4bd85]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:14.677 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[cb295181-1854-4476-9ebe-090c1db2e376]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a07ba3d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:7d:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487844, 'reachable_time': 41346, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216398, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:14.689 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[a66c8229-77f6-42d6-93c8-7e3a15153e3f]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0a07ba3d-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487854, 'tstamp': 487854}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216400, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0a07ba3d-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487857, 'tstamp': 487857}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216400, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:14.690 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a07ba3d-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:14 compute-0 nova_compute[189265]: 2025-09-30 07:24:14.692 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:14 compute-0 nova_compute[189265]: 2025-09-30 07:24:14.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:14.693 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a07ba3d-40, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:14.694 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:24:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:14.694 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a07ba3d-40, col_values=(('external_ids', {'iface-id': '8b5421e3-6f92-4b98-bc3c-4670813d915c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:14.694 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:24:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:14.695 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[72fbdaf5-cfcf-4a8a-b86d-0c8f6249e225]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-0a07ba3d-468f-4279-9be2-b3ef141df6a7\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 0a07ba3d-468f-4279-9be2-b3ef141df6a7\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:16 compute-0 podman[216422]: 2025-09-30 07:24:16.537161634 +0000 UTC m=+0.106182608 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:24:16 compute-0 ovn_controller[91436]: 2025-09-30T07:24:16Z|00115|binding|INFO|Claiming lport 2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d for this chassis.
Sep 30 07:24:16 compute-0 ovn_controller[91436]: 2025-09-30T07:24:16Z|00116|binding|INFO|2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d: Claiming fa:16:3e:aa:53:84 10.100.0.11
Sep 30 07:24:16 compute-0 ovn_controller[91436]: 2025-09-30T07:24:16Z|00117|binding|INFO|Setting lport 2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d up in Southbound
Sep 30 07:24:17 compute-0 nova_compute[189265]: 2025-09-30 07:24:17.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:17 compute-0 nova_compute[189265]: 2025-09-30 07:24:17.974 2 INFO nova.compute.manager [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Post operation of migration started
Sep 30 07:24:17 compute-0 nova_compute[189265]: 2025-09-30 07:24:17.975 2 WARNING neutronclient.v2_0.client [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:18 compute-0 nova_compute[189265]: 2025-09-30 07:24:18.080 2 WARNING neutronclient.v2_0.client [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:18 compute-0 nova_compute[189265]: 2025-09-30 07:24:18.081 2 WARNING neutronclient.v2_0.client [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:18 compute-0 nova_compute[189265]: 2025-09-30 07:24:18.477 2 DEBUG oslo_concurrency.lockutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-d3a690a6-3b2e-4b96-9e64-dd1beeb976cf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:24:18 compute-0 nova_compute[189265]: 2025-09-30 07:24:18.478 2 DEBUG oslo_concurrency.lockutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-d3a690a6-3b2e-4b96-9e64-dd1beeb976cf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:24:18 compute-0 nova_compute[189265]: 2025-09-30 07:24:18.478 2 DEBUG nova.network.neutron [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:24:18 compute-0 nova_compute[189265]: 2025-09-30 07:24:18.988 2 WARNING neutronclient.v2_0.client [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:19 compute-0 nova_compute[189265]: 2025-09-30 07:24:19.491 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:19 compute-0 nova_compute[189265]: 2025-09-30 07:24:19.732 2 WARNING neutronclient.v2_0.client [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:19 compute-0 nova_compute[189265]: 2025-09-30 07:24:19.886 2 DEBUG nova.network.neutron [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Updating instance_info_cache with network_info: [{"id": "2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d", "address": "fa:16:3e:aa:53:84", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f5ca96e-ac", "ovs_interfaceid": "2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:24:20 compute-0 nova_compute[189265]: 2025-09-30 07:24:20.393 2 DEBUG oslo_concurrency.lockutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-d3a690a6-3b2e-4b96-9e64-dd1beeb976cf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:24:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:20.552 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:20.552 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:20.553 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:20 compute-0 nova_compute[189265]: 2025-09-30 07:24:20.916 2 DEBUG oslo_concurrency.lockutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:20 compute-0 nova_compute[189265]: 2025-09-30 07:24:20.917 2 DEBUG oslo_concurrency.lockutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:20 compute-0 nova_compute[189265]: 2025-09-30 07:24:20.918 2 DEBUG oslo_concurrency.lockutils [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:20 compute-0 nova_compute[189265]: 2025-09-30 07:24:20.924 2 INFO nova.virt.libvirt.driver [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 07:24:20 compute-0 virtqemud[189090]: Domain id=8 name='instance-0000000a' uuid=d3a690a6-3b2e-4b96-9e64-dd1beeb976cf is tainted: custom-monitor
Sep 30 07:24:21 compute-0 nova_compute[189265]: 2025-09-30 07:24:21.932 2 INFO nova.virt.libvirt.driver [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 07:24:22 compute-0 nova_compute[189265]: 2025-09-30 07:24:22.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:22 compute-0 nova_compute[189265]: 2025-09-30 07:24:22.938 2 INFO nova.virt.libvirt.driver [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 07:24:22 compute-0 nova_compute[189265]: 2025-09-30 07:24:22.943 2 DEBUG nova.compute.manager [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:24:23 compute-0 nova_compute[189265]: 2025-09-30 07:24:23.454 2 DEBUG nova.objects.instance [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 07:24:24 compute-0 nova_compute[189265]: 2025-09-30 07:24:24.480 2 WARNING neutronclient.v2_0.client [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:24 compute-0 nova_compute[189265]: 2025-09-30 07:24:24.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:24 compute-0 nova_compute[189265]: 2025-09-30 07:24:24.598 2 WARNING neutronclient.v2_0.client [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:24 compute-0 nova_compute[189265]: 2025-09-30 07:24:24.598 2 WARNING neutronclient.v2_0.client [None req-eef93d60-df9f-413d-8a72-36e4bef00c53 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:25 compute-0 podman[216448]: 2025-09-30 07:24:25.498836388 +0000 UTC m=+0.074349119 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:24:27 compute-0 nova_compute[189265]: 2025-09-30 07:24:27.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:28 compute-0 nova_compute[189265]: 2025-09-30 07:24:28.303 2 DEBUG oslo_concurrency.lockutils [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "20fa7912-824c-4a80-87c5-319b83c0031b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:28 compute-0 nova_compute[189265]: 2025-09-30 07:24:28.304 2 DEBUG oslo_concurrency.lockutils [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "20fa7912-824c-4a80-87c5-319b83c0031b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:28 compute-0 nova_compute[189265]: 2025-09-30 07:24:28.304 2 DEBUG oslo_concurrency.lockutils [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:28 compute-0 nova_compute[189265]: 2025-09-30 07:24:28.305 2 DEBUG oslo_concurrency.lockutils [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:28 compute-0 nova_compute[189265]: 2025-09-30 07:24:28.305 2 DEBUG oslo_concurrency.lockutils [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:28 compute-0 nova_compute[189265]: 2025-09-30 07:24:28.320 2 INFO nova.compute.manager [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Terminating instance
Sep 30 07:24:28 compute-0 podman[216469]: 2025-09-30 07:24:28.494118081 +0000 UTC m=+0.070637772 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41)
Sep 30 07:24:28 compute-0 nova_compute[189265]: 2025-09-30 07:24:28.848 2 DEBUG nova.compute.manager [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:24:28 compute-0 kernel: tapacb8976e-1c (unregistering): left promiscuous mode
Sep 30 07:24:28 compute-0 NetworkManager[51813]: <info>  [1759217068.8919] device (tapacb8976e-1c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:24:28 compute-0 ovn_controller[91436]: 2025-09-30T07:24:28Z|00118|binding|INFO|Releasing lport acb8976e-1c6c-4332-be28-4b0a44f90678 from this chassis (sb_readonly=0)
Sep 30 07:24:28 compute-0 ovn_controller[91436]: 2025-09-30T07:24:28Z|00119|binding|INFO|Setting lport acb8976e-1c6c-4332-be28-4b0a44f90678 down in Southbound
Sep 30 07:24:28 compute-0 nova_compute[189265]: 2025-09-30 07:24:28.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:28 compute-0 ovn_controller[91436]: 2025-09-30T07:24:28Z|00120|binding|INFO|Removing iface tapacb8976e-1c ovn-installed in OVS
Sep 30 07:24:28 compute-0 nova_compute[189265]: 2025-09-30 07:24:28.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:28.935 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:74:f4:50 10.100.0.12'], port_security=['fa:16:3e:74:f4:50 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '20fa7912-824c-4a80-87c5-319b83c0031b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ad7bd988b6047509c2c19eb4e0dc32c', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'dc82e88d-abda-4feb-bd34-afbed64798c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea21a402-508c-472e-bd89-e4a2e8cde5bb, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=acb8976e-1c6c-4332-be28-4b0a44f90678) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:24:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:28.936 100322 INFO neutron.agent.ovn.metadata.agent [-] Port acb8976e-1c6c-4332-be28-4b0a44f90678 in datapath 0a07ba3d-468f-4279-9be2-b3ef141df6a7 unbound from our chassis
Sep 30 07:24:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:28.938 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a07ba3d-468f-4279-9be2-b3ef141df6a7
Sep 30 07:24:28 compute-0 nova_compute[189265]: 2025-09-30 07:24:28.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:28.963 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[3fdc04a3-e2a4-4432-beb8-bbcad679cdc3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:28 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Sep 30 07:24:28 compute-0 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000b.scope: Consumed 14.002s CPU time.
Sep 30 07:24:28 compute-0 systemd-machined[149233]: Machine qemu-7-instance-0000000b terminated.
Sep 30 07:24:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:29.012 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[9fbb8a8d-7ce9-4a4b-8dc7-bab94a86a61c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:29.016 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[bac15a63-45f3-477b-b47f-58102ca047ba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:29.058 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[0ad75506-51a8-4c10-8b91-117ba5233f67]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:29.083 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[ca0eb36d-875b-4743-9a31-b052c54b1462]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a07ba3d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:7d:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487844, 'reachable_time': 41346, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216506, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:29.105 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[06a80ec4-0697-49eb-85f5-4d7770fa1268]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0a07ba3d-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487854, 'tstamp': 487854}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216516, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0a07ba3d-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487857, 'tstamp': 487857}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216516, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:29.108 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a07ba3d-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:29.117 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a07ba3d-40, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:29.117 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:24:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:29.118 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a07ba3d-40, col_values=(('external_ids', {'iface-id': '8b5421e3-6f92-4b98-bc3c-4670813d915c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:29.118 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:24:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:29.120 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d40695f2-dcc2-4adb-90ce-035d52c00c68]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-0a07ba3d-468f-4279-9be2-b3ef141df6a7\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 0a07ba3d-468f-4279-9be2-b3ef141df6a7\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.129 2 INFO nova.virt.libvirt.driver [-] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Instance destroyed successfully.
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.130 2 DEBUG nova.objects.instance [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lazy-loading 'resources' on Instance uuid 20fa7912-824c-4a80-87c5-319b83c0031b obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.593 2 DEBUG nova.compute.manager [req-261f0067-9128-479b-bcfb-757f45cc3a7e req-45b22a37-3409-4030-a6f7-8bcbac247ef9 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Received event network-vif-unplugged-acb8976e-1c6c-4332-be28-4b0a44f90678 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.594 2 DEBUG oslo_concurrency.lockutils [req-261f0067-9128-479b-bcfb-757f45cc3a7e req-45b22a37-3409-4030-a6f7-8bcbac247ef9 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.594 2 DEBUG oslo_concurrency.lockutils [req-261f0067-9128-479b-bcfb-757f45cc3a7e req-45b22a37-3409-4030-a6f7-8bcbac247ef9 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.595 2 DEBUG oslo_concurrency.lockutils [req-261f0067-9128-479b-bcfb-757f45cc3a7e req-45b22a37-3409-4030-a6f7-8bcbac247ef9 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.595 2 DEBUG nova.compute.manager [req-261f0067-9128-479b-bcfb-757f45cc3a7e req-45b22a37-3409-4030-a6f7-8bcbac247ef9 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] No waiting events found dispatching network-vif-unplugged-acb8976e-1c6c-4332-be28-4b0a44f90678 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.596 2 DEBUG nova.compute.manager [req-261f0067-9128-479b-bcfb-757f45cc3a7e req-45b22a37-3409-4030-a6f7-8bcbac247ef9 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Received event network-vif-unplugged-acb8976e-1c6c-4332-be28-4b0a44f90678 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.637 2 DEBUG nova.virt.libvirt.vif [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:23:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-592763394',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-592763394',id=11,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:23:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ad7bd988b6047509c2c19eb4e0dc32c',ramdisk_id='',reservation_id='r-owcaxa3t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-385408215',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-385408215-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:23:43Z,user_data=None,user_id='071bf5838f2f473a865873b6f7846f84',uuid=20fa7912-824c-4a80-87c5-319b83c0031b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "acb8976e-1c6c-4332-be28-4b0a44f90678", "address": "fa:16:3e:74:f4:50", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacb8976e-1c", "ovs_interfaceid": "acb8976e-1c6c-4332-be28-4b0a44f90678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.638 2 DEBUG nova.network.os_vif_util [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Converting VIF {"id": "acb8976e-1c6c-4332-be28-4b0a44f90678", "address": "fa:16:3e:74:f4:50", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapacb8976e-1c", "ovs_interfaceid": "acb8976e-1c6c-4332-be28-4b0a44f90678", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.639 2 DEBUG nova.network.os_vif_util [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:74:f4:50,bridge_name='br-int',has_traffic_filtering=True,id=acb8976e-1c6c-4332-be28-4b0a44f90678,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacb8976e-1c') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.640 2 DEBUG os_vif [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:f4:50,bridge_name='br-int',has_traffic_filtering=True,id=acb8976e-1c6c-4332-be28-4b0a44f90678,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacb8976e-1c') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.642 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapacb8976e-1c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.647 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=629553d3-ac53-497a-9bb7-cddb78935d6d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.652 2 INFO os_vif [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:74:f4:50,bridge_name='br-int',has_traffic_filtering=True,id=acb8976e-1c6c-4332-be28-4b0a44f90678,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapacb8976e-1c')
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.653 2 INFO nova.virt.libvirt.driver [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Deleting instance files /var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b_del
Sep 30 07:24:29 compute-0 nova_compute[189265]: 2025-09-30 07:24:29.654 2 INFO nova.virt.libvirt.driver [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Deletion of /var/lib/nova/instances/20fa7912-824c-4a80-87c5-319b83c0031b_del complete
Sep 30 07:24:29 compute-0 podman[199733]: time="2025-09-30T07:24:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:24:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:24:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:24:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:24:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3474 "" "Go-http-client/1.1"
Sep 30 07:24:30 compute-0 nova_compute[189265]: 2025-09-30 07:24:30.168 2 INFO nova.compute.manager [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Took 1.32 seconds to destroy the instance on the hypervisor.
Sep 30 07:24:30 compute-0 nova_compute[189265]: 2025-09-30 07:24:30.168 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:24:30 compute-0 nova_compute[189265]: 2025-09-30 07:24:30.169 2 DEBUG nova.compute.manager [-] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:24:30 compute-0 nova_compute[189265]: 2025-09-30 07:24:30.169 2 DEBUG nova.network.neutron [-] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:24:30 compute-0 nova_compute[189265]: 2025-09-30 07:24:30.170 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:30 compute-0 nova_compute[189265]: 2025-09-30 07:24:30.425 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:31 compute-0 nova_compute[189265]: 2025-09-30 07:24:31.204 2 DEBUG nova.network.neutron [-] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:24:31 compute-0 openstack_network_exporter[201859]: ERROR   07:24:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:24:31 compute-0 openstack_network_exporter[201859]: ERROR   07:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:24:31 compute-0 openstack_network_exporter[201859]: ERROR   07:24:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:24:31 compute-0 openstack_network_exporter[201859]: ERROR   07:24:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:24:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:24:31 compute-0 openstack_network_exporter[201859]: ERROR   07:24:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:24:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:24:31 compute-0 nova_compute[189265]: 2025-09-30 07:24:31.677 2 DEBUG nova.compute.manager [req-5062f030-7248-494d-8441-422cf2f16165 req-8002208e-c5a0-44ff-b5c2-17ec255de616 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Received event network-vif-unplugged-acb8976e-1c6c-4332-be28-4b0a44f90678 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:24:31 compute-0 nova_compute[189265]: 2025-09-30 07:24:31.677 2 DEBUG oslo_concurrency.lockutils [req-5062f030-7248-494d-8441-422cf2f16165 req-8002208e-c5a0-44ff-b5c2-17ec255de616 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:31 compute-0 nova_compute[189265]: 2025-09-30 07:24:31.678 2 DEBUG oslo_concurrency.lockutils [req-5062f030-7248-494d-8441-422cf2f16165 req-8002208e-c5a0-44ff-b5c2-17ec255de616 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:31 compute-0 nova_compute[189265]: 2025-09-30 07:24:31.678 2 DEBUG oslo_concurrency.lockutils [req-5062f030-7248-494d-8441-422cf2f16165 req-8002208e-c5a0-44ff-b5c2-17ec255de616 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "20fa7912-824c-4a80-87c5-319b83c0031b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:31 compute-0 nova_compute[189265]: 2025-09-30 07:24:31.679 2 DEBUG nova.compute.manager [req-5062f030-7248-494d-8441-422cf2f16165 req-8002208e-c5a0-44ff-b5c2-17ec255de616 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] No waiting events found dispatching network-vif-unplugged-acb8976e-1c6c-4332-be28-4b0a44f90678 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:24:31 compute-0 nova_compute[189265]: 2025-09-30 07:24:31.679 2 DEBUG nova.compute.manager [req-5062f030-7248-494d-8441-422cf2f16165 req-8002208e-c5a0-44ff-b5c2-17ec255de616 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Received event network-vif-unplugged-acb8976e-1c6c-4332-be28-4b0a44f90678 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:24:31 compute-0 nova_compute[189265]: 2025-09-30 07:24:31.680 2 DEBUG nova.compute.manager [req-5062f030-7248-494d-8441-422cf2f16165 req-8002208e-c5a0-44ff-b5c2-17ec255de616 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Received event network-vif-deleted-acb8976e-1c6c-4332-be28-4b0a44f90678 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:24:31 compute-0 nova_compute[189265]: 2025-09-30 07:24:31.712 2 INFO nova.compute.manager [-] [instance: 20fa7912-824c-4a80-87c5-319b83c0031b] Took 1.54 seconds to deallocate network for instance.
Sep 30 07:24:32 compute-0 nova_compute[189265]: 2025-09-30 07:24:32.234 2 DEBUG oslo_concurrency.lockutils [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:32 compute-0 nova_compute[189265]: 2025-09-30 07:24:32.235 2 DEBUG oslo_concurrency.lockutils [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:32 compute-0 podman[216525]: 2025-09-30 07:24:32.493103035 +0000 UTC m=+0.070110367 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 07:24:32 compute-0 podman[216524]: 2025-09-30 07:24:32.502782675 +0000 UTC m=+0.076785770 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Sep 30 07:24:32 compute-0 nova_compute[189265]: 2025-09-30 07:24:32.541 2 DEBUG nova.compute.provider_tree [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:24:32 compute-0 podman[216526]: 2025-09-30 07:24:32.596944075 +0000 UTC m=+0.159166900 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 07:24:32 compute-0 nova_compute[189265]: 2025-09-30 07:24:32.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:33 compute-0 nova_compute[189265]: 2025-09-30 07:24:33.048 2 DEBUG nova.scheduler.client.report [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:24:33 compute-0 nova_compute[189265]: 2025-09-30 07:24:33.556 2 DEBUG oslo_concurrency.lockutils [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.322s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:33 compute-0 nova_compute[189265]: 2025-09-30 07:24:33.576 2 INFO nova.scheduler.client.report [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Deleted allocations for instance 20fa7912-824c-4a80-87c5-319b83c0031b
Sep 30 07:24:34 compute-0 nova_compute[189265]: 2025-09-30 07:24:34.607 2 DEBUG oslo_concurrency.lockutils [None req-7a2d5ddc-a7f5-4648-ab0f-d5249d30e95a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "20fa7912-824c-4a80-87c5-319b83c0031b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.303s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:34 compute-0 nova_compute[189265]: 2025-09-30 07:24:34.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:36 compute-0 nova_compute[189265]: 2025-09-30 07:24:36.003 2 DEBUG oslo_concurrency.lockutils [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "d3a690a6-3b2e-4b96-9e64-dd1beeb976cf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:36 compute-0 nova_compute[189265]: 2025-09-30 07:24:36.005 2 DEBUG oslo_concurrency.lockutils [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "d3a690a6-3b2e-4b96-9e64-dd1beeb976cf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:36 compute-0 nova_compute[189265]: 2025-09-30 07:24:36.006 2 DEBUG oslo_concurrency.lockutils [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "d3a690a6-3b2e-4b96-9e64-dd1beeb976cf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:36 compute-0 nova_compute[189265]: 2025-09-30 07:24:36.006 2 DEBUG oslo_concurrency.lockutils [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "d3a690a6-3b2e-4b96-9e64-dd1beeb976cf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:36 compute-0 nova_compute[189265]: 2025-09-30 07:24:36.007 2 DEBUG oslo_concurrency.lockutils [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "d3a690a6-3b2e-4b96-9e64-dd1beeb976cf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:36 compute-0 nova_compute[189265]: 2025-09-30 07:24:36.144 2 INFO nova.compute.manager [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Terminating instance
Sep 30 07:24:36 compute-0 nova_compute[189265]: 2025-09-30 07:24:36.750 2 DEBUG nova.compute.manager [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:24:36 compute-0 kernel: tap2f5ca96e-ac (unregistering): left promiscuous mode
Sep 30 07:24:36 compute-0 NetworkManager[51813]: <info>  [1759217076.7726] device (tap2f5ca96e-ac): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:24:36 compute-0 nova_compute[189265]: 2025-09-30 07:24:36.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:36 compute-0 ovn_controller[91436]: 2025-09-30T07:24:36Z|00121|binding|INFO|Releasing lport 2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d from this chassis (sb_readonly=0)
Sep 30 07:24:36 compute-0 ovn_controller[91436]: 2025-09-30T07:24:36Z|00122|binding|INFO|Setting lport 2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d down in Southbound
Sep 30 07:24:36 compute-0 ovn_controller[91436]: 2025-09-30T07:24:36Z|00123|binding|INFO|Removing iface tap2f5ca96e-ac ovn-installed in OVS
Sep 30 07:24:36 compute-0 nova_compute[189265]: 2025-09-30 07:24:36.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:36 compute-0 nova_compute[189265]: 2025-09-30 07:24:36.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:36 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Sep 30 07:24:36 compute-0 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000a.scope: Consumed 2.158s CPU time.
Sep 30 07:24:36 compute-0 systemd-machined[149233]: Machine qemu-8-instance-0000000a terminated.
Sep 30 07:24:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:36.903 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:53:84 10.100.0.11'], port_security=['fa:16:3e:aa:53:84 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'd3a690a6-3b2e-4b96-9e64-dd1beeb976cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ad7bd988b6047509c2c19eb4e0dc32c', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'dc82e88d-abda-4feb-bd34-afbed64798c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea21a402-508c-472e-bd89-e4a2e8cde5bb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:24:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:36.904 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d in datapath 0a07ba3d-468f-4279-9be2-b3ef141df6a7 unbound from our chassis
Sep 30 07:24:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:36.906 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a07ba3d-468f-4279-9be2-b3ef141df6a7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:24:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:36.907 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[be6ff4d5-1f59-4a9a-84d7-f4a23bd1693a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:36.907 100322 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7 namespace which is not needed anymore
Sep 30 07:24:36 compute-0 kernel: tap2f5ca96e-ac: entered promiscuous mode
Sep 30 07:24:36 compute-0 systemd-udevd[216593]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:24:36 compute-0 NetworkManager[51813]: <info>  [1759217076.9831] manager: (tap2f5ca96e-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/46)
Sep 30 07:24:36 compute-0 kernel: tap2f5ca96e-ac (unregistering): left promiscuous mode
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:37 compute-0 ovn_controller[91436]: 2025-09-30T07:24:37Z|00124|binding|INFO|Claiming lport 2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d for this chassis.
Sep 30 07:24:37 compute-0 ovn_controller[91436]: 2025-09-30T07:24:37Z|00125|binding|INFO|2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d: Claiming fa:16:3e:aa:53:84 10.100.0.11
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.040 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:53:84 10.100.0.11'], port_security=['fa:16:3e:aa:53:84 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'd3a690a6-3b2e-4b96-9e64-dd1beeb976cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ad7bd988b6047509c2c19eb4e0dc32c', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'dc82e88d-abda-4feb-bd34-afbed64798c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea21a402-508c-472e-bd89-e4a2e8cde5bb, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:24:37 compute-0 ovn_controller[91436]: 2025-09-30T07:24:37Z|00126|binding|INFO|Setting lport 2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d ovn-installed in OVS
Sep 30 07:24:37 compute-0 ovn_controller[91436]: 2025-09-30T07:24:37Z|00127|binding|INFO|Setting lport 2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d up in Southbound
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.091 2 INFO nova.virt.libvirt.driver [-] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Instance destroyed successfully.
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.092 2 DEBUG nova.objects.instance [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lazy-loading 'resources' on Instance uuid d3a690a6-3b2e-4b96-9e64-dd1beeb976cf obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:24:37 compute-0 neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7[216159]: [NOTICE]   (216163) : haproxy version is 3.0.5-8e879a5
Sep 30 07:24:37 compute-0 neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7[216159]: [NOTICE]   (216163) : path to executable is /usr/sbin/haproxy
Sep 30 07:24:37 compute-0 neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7[216159]: [WARNING]  (216163) : Exiting Master process...
Sep 30 07:24:37 compute-0 podman[216614]: 2025-09-30 07:24:37.129205198 +0000 UTC m=+0.060934162 container kill f4af1648a0e6e8d19e94079987a314bb1fdfbc33bb064f1d906f5cf06e42dcae (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 07:24:37 compute-0 neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7[216159]: [ALERT]    (216163) : Current worker (216165) exited with code 143 (Terminated)
Sep 30 07:24:37 compute-0 neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7[216159]: [WARNING]  (216163) : All workers exited. Exiting... (0)
Sep 30 07:24:37 compute-0 systemd[1]: libpod-f4af1648a0e6e8d19e94079987a314bb1fdfbc33bb064f1d906f5cf06e42dcae.scope: Deactivated successfully.
Sep 30 07:24:37 compute-0 podman[216635]: 2025-09-30 07:24:37.198805689 +0000 UTC m=+0.043222970 container died f4af1648a0e6e8d19e94079987a314bb1fdfbc33bb064f1d906f5cf06e42dcae (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 07:24:37 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f4af1648a0e6e8d19e94079987a314bb1fdfbc33bb064f1d906f5cf06e42dcae-userdata-shm.mount: Deactivated successfully.
Sep 30 07:24:37 compute-0 systemd[1]: var-lib-containers-storage-overlay-98629fffc2117081759fc3563d10d6fc234fa87f825daa7bf8d4a0f140c23aba-merged.mount: Deactivated successfully.
Sep 30 07:24:37 compute-0 podman[216635]: 2025-09-30 07:24:37.245036225 +0000 UTC m=+0.089453466 container cleanup f4af1648a0e6e8d19e94079987a314bb1fdfbc33bb064f1d906f5cf06e42dcae (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:24:37 compute-0 systemd[1]: libpod-conmon-f4af1648a0e6e8d19e94079987a314bb1fdfbc33bb064f1d906f5cf06e42dcae.scope: Deactivated successfully.
Sep 30 07:24:37 compute-0 podman[216637]: 2025-09-30 07:24:37.261297344 +0000 UTC m=+0.087924651 container remove f4af1648a0e6e8d19e94079987a314bb1fdfbc33bb064f1d906f5cf06e42dcae (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.269 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[1bcce333-7ee4-405e-87a6-4fc39b52221a]: (4, ("Tue Sep 30 07:24:37 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7 (f4af1648a0e6e8d19e94079987a314bb1fdfbc33bb064f1d906f5cf06e42dcae)\nf4af1648a0e6e8d19e94079987a314bb1fdfbc33bb064f1d906f5cf06e42dcae\nTue Sep 30 07:24:37 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7 (f4af1648a0e6e8d19e94079987a314bb1fdfbc33bb064f1d906f5cf06e42dcae)\nf4af1648a0e6e8d19e94079987a314bb1fdfbc33bb064f1d906f5cf06e42dcae\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.270 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[3f00c971-30dc-409f-86de-936db5c7d4ce]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.271 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.271 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[a1e45fe1-f2c9-40bf-b1c6-32b58927def0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.272 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a07ba3d-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:37 compute-0 kernel: tap0a07ba3d-40: left promiscuous mode
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:37 compute-0 ovn_controller[91436]: 2025-09-30T07:24:37Z|00128|binding|INFO|Releasing lport 2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d from this chassis (sb_readonly=0)
Sep 30 07:24:37 compute-0 ovn_controller[91436]: 2025-09-30T07:24:37Z|00129|binding|INFO|Setting lport 2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d down in Southbound
Sep 30 07:24:37 compute-0 ovn_controller[91436]: 2025-09-30T07:24:37Z|00130|binding|INFO|Removing iface tap2f5ca96e-ac ovn-installed in OVS
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.291 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[57355869-1a7e-4729-b4b0-737be7f55329]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.297 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:aa:53:84 10.100.0.11'], port_security=['fa:16:3e:aa:53:84 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'd3a690a6-3b2e-4b96-9e64-dd1beeb976cf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ad7bd988b6047509c2c19eb4e0dc32c', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'dc82e88d-abda-4feb-bd34-afbed64798c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea21a402-508c-472e-bd89-e4a2e8cde5bb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.299 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.326 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[683184fa-936a-4743-9b3b-242b47ff7820]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.327 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[5b693955-f421-47ad-a989-3f283c86731c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.341 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[208a8809-eec0-4efc-aa4d-ed7e9abd8ccc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487837, 'reachable_time': 34999, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216670, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.343 100440 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.343 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[0284b3cf-26e9-4677-83dd-6ba71ce754e2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.344 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d in datapath 0a07ba3d-468f-4279-9be2-b3ef141df6a7 unbound from our chassis
Sep 30 07:24:37 compute-0 systemd[1]: run-netns-ovnmeta\x2d0a07ba3d\x2d468f\x2d4279\x2d9be2\x2db3ef141df6a7.mount: Deactivated successfully.
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.345 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a07ba3d-468f-4279-9be2-b3ef141df6a7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.345 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[1e1c5345-19df-4146-8ea8-f388cf83f591]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.346 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d in datapath 0a07ba3d-468f-4279-9be2-b3ef141df6a7 unbound from our chassis
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.346 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a07ba3d-468f-4279-9be2-b3ef141df6a7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:24:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:37.346 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[54df1924-02c3-4be6-b0b5-d286e28b1b76]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.598 2 DEBUG nova.virt.libvirt.vif [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T07:23:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-12837669',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-12837669',id=10,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:23:19Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ad7bd988b6047509c2c19eb4e0dc32c',ramdisk_id='',reservation_id='r-mf8s205z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='
virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-385408215',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-385408215-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:24:24Z,user_data=None,user_id='071bf5838f2f473a865873b6f7846f84',uuid=d3a690a6-3b2e-4b96-9e64-dd1beeb976cf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d", "address": "fa:16:3e:aa:53:84", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f5ca96e-ac", "ovs_interfaceid": "2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.599 2 DEBUG nova.network.os_vif_util [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Converting VIF {"id": "2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d", "address": "fa:16:3e:aa:53:84", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f5ca96e-ac", "ovs_interfaceid": "2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.600 2 DEBUG nova.network.os_vif_util [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:aa:53:84,bridge_name='br-int',has_traffic_filtering=True,id=2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f5ca96e-ac') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.601 2 DEBUG os_vif [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:aa:53:84,bridge_name='br-int',has_traffic_filtering=True,id=2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f5ca96e-ac') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.603 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2f5ca96e-ac, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.610 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=910f8103-e62d-424b-9c71-a0ddd0a231e0) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.615 2 INFO os_vif [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:aa:53:84,bridge_name='br-int',has_traffic_filtering=True,id=2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f5ca96e-ac')
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.616 2 INFO nova.virt.libvirt.driver [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Deleting instance files /var/lib/nova/instances/d3a690a6-3b2e-4b96-9e64-dd1beeb976cf_del
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.617 2 INFO nova.virt.libvirt.driver [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Deletion of /var/lib/nova/instances/d3a690a6-3b2e-4b96-9e64-dd1beeb976cf_del complete
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.650 2 DEBUG nova.compute.manager [req-b3a30f2a-4027-476b-aaf2-d7d9d90241db req-75cb5eaa-04db-4309-a441-91797696fd36 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Received event network-vif-unplugged-2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.651 2 DEBUG oslo_concurrency.lockutils [req-b3a30f2a-4027-476b-aaf2-d7d9d90241db req-75cb5eaa-04db-4309-a441-91797696fd36 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "d3a690a6-3b2e-4b96-9e64-dd1beeb976cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.651 2 DEBUG oslo_concurrency.lockutils [req-b3a30f2a-4027-476b-aaf2-d7d9d90241db req-75cb5eaa-04db-4309-a441-91797696fd36 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d3a690a6-3b2e-4b96-9e64-dd1beeb976cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.651 2 DEBUG oslo_concurrency.lockutils [req-b3a30f2a-4027-476b-aaf2-d7d9d90241db req-75cb5eaa-04db-4309-a441-91797696fd36 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d3a690a6-3b2e-4b96-9e64-dd1beeb976cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.652 2 DEBUG nova.compute.manager [req-b3a30f2a-4027-476b-aaf2-d7d9d90241db req-75cb5eaa-04db-4309-a441-91797696fd36 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] No waiting events found dispatching network-vif-unplugged-2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.652 2 DEBUG nova.compute.manager [req-b3a30f2a-4027-476b-aaf2-d7d9d90241db req-75cb5eaa-04db-4309-a441-91797696fd36 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Received event network-vif-unplugged-2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:37 compute-0 nova_compute[189265]: 2025-09-30 07:24:37.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:24:38 compute-0 nova_compute[189265]: 2025-09-30 07:24:38.130 2 INFO nova.compute.manager [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Took 1.38 seconds to destroy the instance on the hypervisor.
Sep 30 07:24:38 compute-0 nova_compute[189265]: 2025-09-30 07:24:38.131 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:24:38 compute-0 nova_compute[189265]: 2025-09-30 07:24:38.131 2 DEBUG nova.compute.manager [-] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:24:38 compute-0 nova_compute[189265]: 2025-09-30 07:24:38.132 2 DEBUG nova.network.neutron [-] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:24:38 compute-0 nova_compute[189265]: 2025-09-30 07:24:38.132 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:38.499 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:24:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:38.500 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:24:38 compute-0 nova_compute[189265]: 2025-09-30 07:24:38.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:38 compute-0 nova_compute[189265]: 2025-09-30 07:24:38.521 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:24:38 compute-0 nova_compute[189265]: 2025-09-30 07:24:38.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:24:38 compute-0 nova_compute[189265]: 2025-09-30 07:24:38.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:24:38 compute-0 nova_compute[189265]: 2025-09-30 07:24:38.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:24:39 compute-0 nova_compute[189265]: 2025-09-30 07:24:39.438 2 DEBUG nova.network.neutron [-] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:24:39 compute-0 nova_compute[189265]: 2025-09-30 07:24:39.739 2 DEBUG nova.compute.manager [req-d6a9b156-ff7f-47ae-ab0d-94f6a4bcbe1e req-a26b27ad-9b9b-449f-8361-a5ad2075194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Received event network-vif-unplugged-2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:24:39 compute-0 nova_compute[189265]: 2025-09-30 07:24:39.739 2 DEBUG oslo_concurrency.lockutils [req-d6a9b156-ff7f-47ae-ab0d-94f6a4bcbe1e req-a26b27ad-9b9b-449f-8361-a5ad2075194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "d3a690a6-3b2e-4b96-9e64-dd1beeb976cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:39 compute-0 nova_compute[189265]: 2025-09-30 07:24:39.740 2 DEBUG oslo_concurrency.lockutils [req-d6a9b156-ff7f-47ae-ab0d-94f6a4bcbe1e req-a26b27ad-9b9b-449f-8361-a5ad2075194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d3a690a6-3b2e-4b96-9e64-dd1beeb976cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:39 compute-0 nova_compute[189265]: 2025-09-30 07:24:39.740 2 DEBUG oslo_concurrency.lockutils [req-d6a9b156-ff7f-47ae-ab0d-94f6a4bcbe1e req-a26b27ad-9b9b-449f-8361-a5ad2075194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d3a690a6-3b2e-4b96-9e64-dd1beeb976cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:39 compute-0 nova_compute[189265]: 2025-09-30 07:24:39.741 2 DEBUG nova.compute.manager [req-d6a9b156-ff7f-47ae-ab0d-94f6a4bcbe1e req-a26b27ad-9b9b-449f-8361-a5ad2075194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] No waiting events found dispatching network-vif-unplugged-2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:24:39 compute-0 nova_compute[189265]: 2025-09-30 07:24:39.741 2 DEBUG nova.compute.manager [req-d6a9b156-ff7f-47ae-ab0d-94f6a4bcbe1e req-a26b27ad-9b9b-449f-8361-a5ad2075194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Received event network-vif-unplugged-2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:24:39 compute-0 nova_compute[189265]: 2025-09-30 07:24:39.742 2 DEBUG nova.compute.manager [req-d6a9b156-ff7f-47ae-ab0d-94f6a4bcbe1e req-a26b27ad-9b9b-449f-8361-a5ad2075194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Received event network-vif-plugged-2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:24:39 compute-0 nova_compute[189265]: 2025-09-30 07:24:39.742 2 DEBUG oslo_concurrency.lockutils [req-d6a9b156-ff7f-47ae-ab0d-94f6a4bcbe1e req-a26b27ad-9b9b-449f-8361-a5ad2075194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "d3a690a6-3b2e-4b96-9e64-dd1beeb976cf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:39 compute-0 nova_compute[189265]: 2025-09-30 07:24:39.743 2 DEBUG oslo_concurrency.lockutils [req-d6a9b156-ff7f-47ae-ab0d-94f6a4bcbe1e req-a26b27ad-9b9b-449f-8361-a5ad2075194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d3a690a6-3b2e-4b96-9e64-dd1beeb976cf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:39 compute-0 nova_compute[189265]: 2025-09-30 07:24:39.743 2 DEBUG oslo_concurrency.lockutils [req-d6a9b156-ff7f-47ae-ab0d-94f6a4bcbe1e req-a26b27ad-9b9b-449f-8361-a5ad2075194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d3a690a6-3b2e-4b96-9e64-dd1beeb976cf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:39 compute-0 nova_compute[189265]: 2025-09-30 07:24:39.744 2 DEBUG nova.compute.manager [req-d6a9b156-ff7f-47ae-ab0d-94f6a4bcbe1e req-a26b27ad-9b9b-449f-8361-a5ad2075194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] No waiting events found dispatching network-vif-plugged-2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:24:39 compute-0 nova_compute[189265]: 2025-09-30 07:24:39.744 2 WARNING nova.compute.manager [req-d6a9b156-ff7f-47ae-ab0d-94f6a4bcbe1e req-a26b27ad-9b9b-449f-8361-a5ad2075194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Received unexpected event network-vif-plugged-2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d for instance with vm_state active and task_state deleting.
Sep 30 07:24:39 compute-0 nova_compute[189265]: 2025-09-30 07:24:39.745 2 DEBUG nova.compute.manager [req-d6a9b156-ff7f-47ae-ab0d-94f6a4bcbe1e req-a26b27ad-9b9b-449f-8361-a5ad2075194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Received event network-vif-deleted-2f5ca96e-ac41-4c02-9f7e-2dee444d4a8d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:24:39 compute-0 nova_compute[189265]: 2025-09-30 07:24:39.944 2 INFO nova.compute.manager [-] [instance: d3a690a6-3b2e-4b96-9e64-dd1beeb976cf] Took 1.81 seconds to deallocate network for instance.
Sep 30 07:24:40 compute-0 nova_compute[189265]: 2025-09-30 07:24:40.465 2 DEBUG oslo_concurrency.lockutils [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:40 compute-0 nova_compute[189265]: 2025-09-30 07:24:40.465 2 DEBUG oslo_concurrency.lockutils [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:40 compute-0 nova_compute[189265]: 2025-09-30 07:24:40.471 2 DEBUG oslo_concurrency.lockutils [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:40 compute-0 nova_compute[189265]: 2025-09-30 07:24:40.567 2 INFO nova.scheduler.client.report [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Deleted allocations for instance d3a690a6-3b2e-4b96-9e64-dd1beeb976cf
Sep 30 07:24:41 compute-0 nova_compute[189265]: 2025-09-30 07:24:41.640 2 DEBUG oslo_concurrency.lockutils [None req-a75b8f7b-26d7-4b0e-b623-c1fb852cff8a 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "d3a690a6-3b2e-4b96-9e64-dd1beeb976cf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.635s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:41 compute-0 nova_compute[189265]: 2025-09-30 07:24:41.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:24:42 compute-0 nova_compute[189265]: 2025-09-30 07:24:42.309 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:42 compute-0 nova_compute[189265]: 2025-09-30 07:24:42.309 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:42 compute-0 nova_compute[189265]: 2025-09-30 07:24:42.310 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:42 compute-0 nova_compute[189265]: 2025-09-30 07:24:42.310 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:24:42 compute-0 nova_compute[189265]: 2025-09-30 07:24:42.509 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:24:42 compute-0 nova_compute[189265]: 2025-09-30 07:24:42.510 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:24:42 compute-0 nova_compute[189265]: 2025-09-30 07:24:42.540 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:24:42 compute-0 nova_compute[189265]: 2025-09-30 07:24:42.541 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5836MB free_disk=73.30397415161133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:24:42 compute-0 nova_compute[189265]: 2025-09-30 07:24:42.542 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:24:42 compute-0 nova_compute[189265]: 2025-09-30 07:24:42.542 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:24:42 compute-0 nova_compute[189265]: 2025-09-30 07:24:42.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:42 compute-0 nova_compute[189265]: 2025-09-30 07:24:42.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:43 compute-0 nova_compute[189265]: 2025-09-30 07:24:43.634 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:24:43 compute-0 nova_compute[189265]: 2025-09-30 07:24:43.635 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:24:42 up  1:22,  0 user,  load average: 0.24, 0.23, 0.36\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:24:43 compute-0 nova_compute[189265]: 2025-09-30 07:24:43.696 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing inventories for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 07:24:43 compute-0 nova_compute[189265]: 2025-09-30 07:24:43.708 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating ProviderTree inventory for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 07:24:43 compute-0 nova_compute[189265]: 2025-09-30 07:24:43.709 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating inventory in ProviderTree for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 07:24:43 compute-0 nova_compute[189265]: 2025-09-30 07:24:43.723 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing aggregate associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 07:24:43 compute-0 nova_compute[189265]: 2025-09-30 07:24:43.740 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing trait associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, traits: COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_FMA3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_AC97,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_ABM,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_MMX
,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,HW_CPU_X86_CLMUL _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 07:24:43 compute-0 nova_compute[189265]: 2025-09-30 07:24:43.761 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:24:44 compute-0 nova_compute[189265]: 2025-09-30 07:24:44.268 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:24:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:24:44.503 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:24:44 compute-0 nova_compute[189265]: 2025-09-30 07:24:44.778 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:24:44 compute-0 nova_compute[189265]: 2025-09-30 07:24:44.779 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.237s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:24:45 compute-0 nova_compute[189265]: 2025-09-30 07:24:45.780 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:24:47 compute-0 podman[216674]: 2025-09-30 07:24:47.5417501 +0000 UTC m=+0.122911203 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:24:47 compute-0 nova_compute[189265]: 2025-09-30 07:24:47.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:47 compute-0 nova_compute[189265]: 2025-09-30 07:24:47.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:47 compute-0 nova_compute[189265]: 2025-09-30 07:24:47.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:24:49 compute-0 nova_compute[189265]: 2025-09-30 07:24:49.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:24:52 compute-0 nova_compute[189265]: 2025-09-30 07:24:52.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:52 compute-0 nova_compute[189265]: 2025-09-30 07:24:52.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:56 compute-0 podman[216701]: 2025-09-30 07:24:56.520558307 +0000 UTC m=+0.088576841 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Sep 30 07:24:57 compute-0 nova_compute[189265]: 2025-09-30 07:24:57.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:57 compute-0 nova_compute[189265]: 2025-09-30 07:24:57.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:24:59 compute-0 podman[216722]: 2025-09-30 07:24:59.479506071 +0000 UTC m=+0.070282751 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm)
Sep 30 07:24:59 compute-0 podman[199733]: time="2025-09-30T07:24:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:24:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:24:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:24:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:24:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3011 "" "Go-http-client/1.1"
Sep 30 07:25:01 compute-0 openstack_network_exporter[201859]: ERROR   07:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:25:01 compute-0 openstack_network_exporter[201859]: ERROR   07:25:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:25:01 compute-0 openstack_network_exporter[201859]: ERROR   07:25:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:25:01 compute-0 openstack_network_exporter[201859]: ERROR   07:25:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:25:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:25:01 compute-0 openstack_network_exporter[201859]: ERROR   07:25:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:25:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:25:02 compute-0 nova_compute[189265]: 2025-09-30 07:25:02.622 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:02 compute-0 nova_compute[189265]: 2025-09-30 07:25:02.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:03 compute-0 podman[216744]: 2025-09-30 07:25:03.495700202 +0000 UTC m=+0.070260861 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 07:25:03 compute-0 podman[216743]: 2025-09-30 07:25:03.50671701 +0000 UTC m=+0.081762273 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=multipathd)
Sep 30 07:25:03 compute-0 podman[216745]: 2025-09-30 07:25:03.555305614 +0000 UTC m=+0.122992134 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 07:25:07 compute-0 nova_compute[189265]: 2025-09-30 07:25:07.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:07 compute-0 nova_compute[189265]: 2025-09-30 07:25:07.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:08 compute-0 ovn_controller[91436]: 2025-09-30T07:25:08Z|00131|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Sep 30 07:25:12 compute-0 nova_compute[189265]: 2025-09-30 07:25:12.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:12 compute-0 nova_compute[189265]: 2025-09-30 07:25:12.761 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:17 compute-0 nova_compute[189265]: 2025-09-30 07:25:17.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:17 compute-0 nova_compute[189265]: 2025-09-30 07:25:17.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:18 compute-0 podman[216803]: 2025-09-30 07:25:18.464135727 +0000 UTC m=+0.051744186 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:25:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:20.553 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:25:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:20.553 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:25:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:20.554 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:25:22 compute-0 nova_compute[189265]: 2025-09-30 07:25:22.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:22 compute-0 nova_compute[189265]: 2025-09-30 07:25:22.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:27 compute-0 podman[216828]: 2025-09-30 07:25:27.506181922 +0000 UTC m=+0.078884050 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 07:25:27 compute-0 nova_compute[189265]: 2025-09-30 07:25:27.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:27 compute-0 nova_compute[189265]: 2025-09-30 07:25:27.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:28 compute-0 nova_compute[189265]: 2025-09-30 07:25:28.120 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "de990aa4-7e40-416a-8e7b-ebf90847bb68" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:25:28 compute-0 nova_compute[189265]: 2025-09-30 07:25:28.121 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "de990aa4-7e40-416a-8e7b-ebf90847bb68" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:25:28 compute-0 nova_compute[189265]: 2025-09-30 07:25:28.626 2 DEBUG nova.compute.manager [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 07:25:29 compute-0 nova_compute[189265]: 2025-09-30 07:25:29.177 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:25:29 compute-0 nova_compute[189265]: 2025-09-30 07:25:29.178 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:25:29 compute-0 nova_compute[189265]: 2025-09-30 07:25:29.183 2 DEBUG nova.virt.hardware [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 07:25:29 compute-0 nova_compute[189265]: 2025-09-30 07:25:29.184 2 INFO nova.compute.claims [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Claim successful on node compute-0.ctlplane.example.com
Sep 30 07:25:29 compute-0 podman[199733]: time="2025-09-30T07:25:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:25:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:25:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:25:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:25:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3006 "" "Go-http-client/1.1"
Sep 30 07:25:30 compute-0 nova_compute[189265]: 2025-09-30 07:25:30.263 2 DEBUG nova.compute.provider_tree [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:25:30 compute-0 podman[216849]: 2025-09-30 07:25:30.508232031 +0000 UTC m=+0.090129835 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, version=9.6, distribution-scope=public, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, config_id=edpm, container_name=openstack_network_exporter)
Sep 30 07:25:30 compute-0 nova_compute[189265]: 2025-09-30 07:25:30.775 2 DEBUG nova.scheduler.client.report [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:25:31 compute-0 nova_compute[189265]: 2025-09-30 07:25:31.288 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.110s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:25:31 compute-0 nova_compute[189265]: 2025-09-30 07:25:31.289 2 DEBUG nova.compute.manager [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 07:25:31 compute-0 openstack_network_exporter[201859]: ERROR   07:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:25:31 compute-0 openstack_network_exporter[201859]: ERROR   07:25:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:25:31 compute-0 openstack_network_exporter[201859]: ERROR   07:25:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:25:31 compute-0 openstack_network_exporter[201859]: ERROR   07:25:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:25:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:25:31 compute-0 openstack_network_exporter[201859]: ERROR   07:25:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:25:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:25:31 compute-0 nova_compute[189265]: 2025-09-30 07:25:31.805 2 DEBUG nova.compute.manager [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 07:25:31 compute-0 nova_compute[189265]: 2025-09-30 07:25:31.806 2 DEBUG nova.network.neutron [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 07:25:31 compute-0 nova_compute[189265]: 2025-09-30 07:25:31.806 2 WARNING neutronclient.v2_0.client [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:25:31 compute-0 nova_compute[189265]: 2025-09-30 07:25:31.807 2 WARNING neutronclient.v2_0.client [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:25:32 compute-0 nova_compute[189265]: 2025-09-30 07:25:32.317 2 INFO nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 07:25:32 compute-0 nova_compute[189265]: 2025-09-30 07:25:32.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:32 compute-0 nova_compute[189265]: 2025-09-30 07:25:32.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:32 compute-0 nova_compute[189265]: 2025-09-30 07:25:32.844 2 DEBUG nova.compute.manager [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 07:25:33 compute-0 nova_compute[189265]: 2025-09-30 07:25:33.709 2 DEBUG nova.network.neutron [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Successfully created port: a937ddac-5b07-4fc5-8b58-8cc93dee8cac _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 07:25:33 compute-0 nova_compute[189265]: 2025-09-30 07:25:33.861 2 DEBUG nova.compute.manager [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 07:25:33 compute-0 nova_compute[189265]: 2025-09-30 07:25:33.863 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 07:25:33 compute-0 nova_compute[189265]: 2025-09-30 07:25:33.864 2 INFO nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Creating image(s)
Sep 30 07:25:33 compute-0 nova_compute[189265]: 2025-09-30 07:25:33.865 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "/var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:25:33 compute-0 nova_compute[189265]: 2025-09-30 07:25:33.865 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "/var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:25:33 compute-0 nova_compute[189265]: 2025-09-30 07:25:33.866 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "/var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:25:33 compute-0 nova_compute[189265]: 2025-09-30 07:25:33.867 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:25:33 compute-0 nova_compute[189265]: 2025-09-30 07:25:33.874 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:25:33 compute-0 nova_compute[189265]: 2025-09-30 07:25:33.876 2 DEBUG oslo_concurrency.processutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:25:33 compute-0 nova_compute[189265]: 2025-09-30 07:25:33.963 2 DEBUG oslo_concurrency.processutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:25:33 compute-0 nova_compute[189265]: 2025-09-30 07:25:33.965 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:25:33 compute-0 nova_compute[189265]: 2025-09-30 07:25:33.965 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:25:33 compute-0 nova_compute[189265]: 2025-09-30 07:25:33.966 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:25:33 compute-0 nova_compute[189265]: 2025-09-30 07:25:33.970 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:25:33 compute-0 nova_compute[189265]: 2025-09-30 07:25:33.970 2 DEBUG oslo_concurrency.processutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.034 2 DEBUG oslo_concurrency.processutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.035 2 DEBUG oslo_concurrency.processutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.063 2 DEBUG oslo_concurrency.processutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.064 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.064 2 DEBUG oslo_concurrency.processutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.124 2 DEBUG oslo_concurrency.processutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.126 2 DEBUG nova.virt.disk.api [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Checking if we can resize image /var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.127 2 DEBUG oslo_concurrency.processutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.177 2 DEBUG oslo_concurrency.processutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.178 2 DEBUG nova.virt.disk.api [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Cannot resize image /var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.179 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.180 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Ensure instance console log exists: /var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.180 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.181 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.182 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:25:34 compute-0 podman[216886]: 2025-09-30 07:25:34.512699772 +0000 UTC m=+0.076067379 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Sep 30 07:25:34 compute-0 podman[216885]: 2025-09-30 07:25:34.552694798 +0000 UTC m=+0.120335948 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 07:25:34 compute-0 podman[216887]: 2025-09-30 07:25:34.571034378 +0000 UTC m=+0.129161093 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.720 2 DEBUG nova.network.neutron [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Successfully updated port: a937ddac-5b07-4fc5-8b58-8cc93dee8cac _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.788 2 DEBUG nova.compute.manager [req-d78de553-341e-44e5-bc93-244f51850cae req-28d5adb4-1eac-4641-83ba-911bd5faeccd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Received event network-changed-a937ddac-5b07-4fc5-8b58-8cc93dee8cac external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.788 2 DEBUG nova.compute.manager [req-d78de553-341e-44e5-bc93-244f51850cae req-28d5adb4-1eac-4641-83ba-911bd5faeccd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Refreshing instance network info cache due to event network-changed-a937ddac-5b07-4fc5-8b58-8cc93dee8cac. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.789 2 DEBUG oslo_concurrency.lockutils [req-d78de553-341e-44e5-bc93-244f51850cae req-28d5adb4-1eac-4641-83ba-911bd5faeccd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-de990aa4-7e40-416a-8e7b-ebf90847bb68" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.789 2 DEBUG oslo_concurrency.lockutils [req-d78de553-341e-44e5-bc93-244f51850cae req-28d5adb4-1eac-4641-83ba-911bd5faeccd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-de990aa4-7e40-416a-8e7b-ebf90847bb68" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:25:34 compute-0 nova_compute[189265]: 2025-09-30 07:25:34.790 2 DEBUG nova.network.neutron [req-d78de553-341e-44e5-bc93-244f51850cae req-28d5adb4-1eac-4641-83ba-911bd5faeccd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Refreshing network info cache for port a937ddac-5b07-4fc5-8b58-8cc93dee8cac _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 07:25:35 compute-0 nova_compute[189265]: 2025-09-30 07:25:35.231 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "refresh_cache-de990aa4-7e40-416a-8e7b-ebf90847bb68" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:25:35 compute-0 nova_compute[189265]: 2025-09-30 07:25:35.297 2 WARNING neutronclient.v2_0.client [req-d78de553-341e-44e5-bc93-244f51850cae req-28d5adb4-1eac-4641-83ba-911bd5faeccd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:25:35 compute-0 nova_compute[189265]: 2025-09-30 07:25:35.540 2 DEBUG nova.network.neutron [req-d78de553-341e-44e5-bc93-244f51850cae req-28d5adb4-1eac-4641-83ba-911bd5faeccd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:25:35 compute-0 nova_compute[189265]: 2025-09-30 07:25:35.748 2 DEBUG nova.network.neutron [req-d78de553-341e-44e5-bc93-244f51850cae req-28d5adb4-1eac-4641-83ba-911bd5faeccd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:25:36 compute-0 nova_compute[189265]: 2025-09-30 07:25:36.258 2 DEBUG oslo_concurrency.lockutils [req-d78de553-341e-44e5-bc93-244f51850cae req-28d5adb4-1eac-4641-83ba-911bd5faeccd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-de990aa4-7e40-416a-8e7b-ebf90847bb68" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:25:36 compute-0 nova_compute[189265]: 2025-09-30 07:25:36.260 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquired lock "refresh_cache-de990aa4-7e40-416a-8e7b-ebf90847bb68" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:25:36 compute-0 nova_compute[189265]: 2025-09-30 07:25:36.260 2 DEBUG nova.network.neutron [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:25:36 compute-0 nova_compute[189265]: 2025-09-30 07:25:36.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:25:37 compute-0 nova_compute[189265]: 2025-09-30 07:25:37.542 2 DEBUG nova.network.neutron [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:25:37 compute-0 nova_compute[189265]: 2025-09-30 07:25:37.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:37 compute-0 nova_compute[189265]: 2025-09-30 07:25:37.770 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:37 compute-0 nova_compute[189265]: 2025-09-30 07:25:37.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:25:38 compute-0 nova_compute[189265]: 2025-09-30 07:25:38.502 2 WARNING neutronclient.v2_0.client [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:25:38 compute-0 nova_compute[189265]: 2025-09-30 07:25:38.638 2 DEBUG nova.network.neutron [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Updating instance_info_cache with network_info: [{"id": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "address": "fa:16:3e:f7:c8:c7", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa937ddac-5b", "ovs_interfaceid": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.145 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Releasing lock "refresh_cache-de990aa4-7e40-416a-8e7b-ebf90847bb68" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.146 2 DEBUG nova.compute.manager [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Instance network_info: |[{"id": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "address": "fa:16:3e:f7:c8:c7", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa937ddac-5b", "ovs_interfaceid": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.150 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Start _get_guest_xml network_info=[{"id": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "address": "fa:16:3e:f7:c8:c7", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa937ddac-5b", "ovs_interfaceid": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '0c6b92f5-9861-49e4-862d-3ffd84520dfa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.156 2 WARNING nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.158 2 DEBUG nova.virt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteHostMaintenanceStrategy-server-1592299149', uuid='de990aa4-7e40-416a-8e7b-ebf90847bb68'), owner=OwnerMeta(userid='071bf5838f2f473a865873b6f7846f84', username='tempest-TestExecuteHostMaintenanceStrategy-385408215-project-admin', projectid='2ad7bd988b6047509c2c19eb4e0dc32c', projectname='tempest-TestExecuteHostMaintenanceStrategy-385408215'), image=ImageMeta(id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "address": "fa:16:3e:f7:c8:c7", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa937ddac-5b", "ovs_interfaceid": 
"a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759217139.1579669) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.164 2 DEBUG nova.virt.libvirt.host [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.164 2 DEBUG nova.virt.libvirt.host [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.169 2 DEBUG nova.virt.libvirt.host [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.170 2 DEBUG nova.virt.libvirt.host [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.171 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.171 2 DEBUG nova.virt.hardware [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T07:07:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.172 2 DEBUG nova.virt.hardware [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.173 2 DEBUG nova.virt.hardware [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.174 2 DEBUG nova.virt.hardware [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.174 2 DEBUG nova.virt.hardware [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.175 2 DEBUG nova.virt.hardware [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.175 2 DEBUG nova.virt.hardware [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.176 2 DEBUG nova.virt.hardware [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.176 2 DEBUG nova.virt.hardware [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.177 2 DEBUG nova.virt.hardware [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.177 2 DEBUG nova.virt.hardware [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.186 2 DEBUG nova.virt.libvirt.vif [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:25:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1592299149',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1592299149',id=13,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2ad7bd988b6047509c2c19eb4e0dc32c',ramdisk_id='',reservation_id='r-2fx9o6oa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-385408215',owner_user_name='tempest-TestExe
cuteHostMaintenanceStrategy-385408215-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:25:32Z,user_data=None,user_id='071bf5838f2f473a865873b6f7846f84',uuid=de990aa4-7e40-416a-8e7b-ebf90847bb68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "address": "fa:16:3e:f7:c8:c7", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa937ddac-5b", "ovs_interfaceid": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.187 2 DEBUG nova.network.os_vif_util [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Converting VIF {"id": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "address": "fa:16:3e:f7:c8:c7", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa937ddac-5b", "ovs_interfaceid": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.188 2 DEBUG nova.network.os_vif_util [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:c8:c7,bridge_name='br-int',has_traffic_filtering=True,id=a937ddac-5b07-4fc5-8b58-8cc93dee8cac,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa937ddac-5b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.189 2 DEBUG nova.objects.instance [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lazy-loading 'pci_devices' on Instance uuid de990aa4-7e40-416a-8e7b-ebf90847bb68 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.700 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] End _get_guest_xml xml=<domain type="kvm">
Sep 30 07:25:39 compute-0 nova_compute[189265]:   <uuid>de990aa4-7e40-416a-8e7b-ebf90847bb68</uuid>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   <name>instance-0000000d</name>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   <memory>131072</memory>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   <vcpu>1</vcpu>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteHostMaintenanceStrategy-server-1592299149</nova:name>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:25:39</nova:creationTime>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:25:39 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:25:39 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:25:39 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:25:39 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:25:39 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:25:39 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:25:39 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:25:39 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:25:39 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:25:39 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:25:39 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:25:39 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:25:39 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:25:39 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:25:39 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:25:39 compute-0 nova_compute[189265]:         <nova:user uuid="071bf5838f2f473a865873b6f7846f84">tempest-TestExecuteHostMaintenanceStrategy-385408215-project-admin</nova:user>
Sep 30 07:25:39 compute-0 nova_compute[189265]:         <nova:project uuid="2ad7bd988b6047509c2c19eb4e0dc32c">tempest-TestExecuteHostMaintenanceStrategy-385408215</nova:project>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:25:39 compute-0 nova_compute[189265]:         <nova:port uuid="a937ddac-5b07-4fc5-8b58-8cc93dee8cac">
Sep 30 07:25:39 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <system>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <entry name="serial">de990aa4-7e40-416a-8e7b-ebf90847bb68</entry>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <entry name="uuid">de990aa4-7e40-416a-8e7b-ebf90847bb68</entry>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     </system>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   <os>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   </os>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   <features>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <vmcoreinfo/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   </features>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   <cpu mode="host-model" match="exact">
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk.config"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:f7:c8:c7"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <target dev="tapa937ddac-5b"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/console.log" append="off"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <video>
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     </video>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <controller type="usb" index="0"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:25:39 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:25:39 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:25:39 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:25:39 compute-0 nova_compute[189265]: </domain>
Sep 30 07:25:39 compute-0 nova_compute[189265]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.701 2 DEBUG nova.compute.manager [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Preparing to wait for external event network-vif-plugged-a937ddac-5b07-4fc5-8b58-8cc93dee8cac prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.702 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.702 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.703 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.704 2 DEBUG nova.virt.libvirt.vif [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:25:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1592299149',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1592299149',id=13,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2ad7bd988b6047509c2c19eb4e0dc32c',ramdisk_id='',reservation_id='r-2fx9o6oa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-385408215',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-385408215-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:25:32Z,user_data=None,user_id='071bf5838f2f473a865873b6f7846f84',uuid=de990aa4-7e40-416a-8e7b-ebf90847bb68,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "address": "fa:16:3e:f7:c8:c7", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa937ddac-5b", "ovs_interfaceid": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.704 2 DEBUG nova.network.os_vif_util [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Converting VIF {"id": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "address": "fa:16:3e:f7:c8:c7", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa937ddac-5b", "ovs_interfaceid": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.705 2 DEBUG nova.network.os_vif_util [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:c8:c7,bridge_name='br-int',has_traffic_filtering=True,id=a937ddac-5b07-4fc5-8b58-8cc93dee8cac,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa937ddac-5b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.705 2 DEBUG os_vif [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:c8:c7,bridge_name='br-int',has_traffic_filtering=True,id=a937ddac-5b07-4fc5-8b58-8cc93dee8cac,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa937ddac-5b') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.706 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.707 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.707 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.708 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '4cd6cd75-6e8e-55ca-80d0-34a6c949643e', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.748 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.754 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa937ddac-5b, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.754 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa937ddac-5b, col_values=(('qos', UUID('f2d9a9fb-b44c-448c-ab14-8d084cb25d44')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.755 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa937ddac-5b, col_values=(('external_ids', {'iface-id': 'a937ddac-5b07-4fc5-8b58-8cc93dee8cac', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f7:c8:c7', 'vm-uuid': 'de990aa4-7e40-416a-8e7b-ebf90847bb68'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:39 compute-0 NetworkManager[51813]: <info>  [1759217139.7571] manager: (tapa937ddac-5b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.765 2 INFO os_vif [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:c8:c7,bridge_name='br-int',has_traffic_filtering=True,id=a937ddac-5b07-4fc5-8b58-8cc93dee8cac,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa937ddac-5b')
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:25:39 compute-0 nova_compute[189265]: 2025-09-30 07:25:39.789 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:25:41 compute-0 nova_compute[189265]: 2025-09-30 07:25:41.342 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:25:41 compute-0 nova_compute[189265]: 2025-09-30 07:25:41.342 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:25:41 compute-0 nova_compute[189265]: 2025-09-30 07:25:41.343 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] No VIF found with MAC fa:16:3e:f7:c8:c7, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 07:25:41 compute-0 nova_compute[189265]: 2025-09-30 07:25:41.343 2 INFO nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Using config drive
Sep 30 07:25:41 compute-0 nova_compute[189265]: 2025-09-30 07:25:41.856 2 WARNING neutronclient.v2_0.client [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:25:42 compute-0 nova_compute[189265]: 2025-09-30 07:25:42.649 2 INFO nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Creating config drive at /var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk.config
Sep 30 07:25:42 compute-0 nova_compute[189265]: 2025-09-30 07:25:42.658 2 DEBUG oslo_concurrency.processutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpgjs0hc2b execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:25:42 compute-0 nova_compute[189265]: 2025-09-30 07:25:42.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:42 compute-0 nova_compute[189265]: 2025-09-30 07:25:42.784 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:25:42 compute-0 nova_compute[189265]: 2025-09-30 07:25:42.800 2 DEBUG oslo_concurrency.processutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpgjs0hc2b" returned: 0 in 0.142s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:25:42 compute-0 kernel: tapa937ddac-5b: entered promiscuous mode
Sep 30 07:25:42 compute-0 NetworkManager[51813]: <info>  [1759217142.8867] manager: (tapa937ddac-5b): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Sep 30 07:25:42 compute-0 ovn_controller[91436]: 2025-09-30T07:25:42Z|00132|binding|INFO|Claiming lport a937ddac-5b07-4fc5-8b58-8cc93dee8cac for this chassis.
Sep 30 07:25:42 compute-0 ovn_controller[91436]: 2025-09-30T07:25:42Z|00133|binding|INFO|a937ddac-5b07-4fc5-8b58-8cc93dee8cac: Claiming fa:16:3e:f7:c8:c7 10.100.0.5
Sep 30 07:25:42 compute-0 nova_compute[189265]: 2025-09-30 07:25:42.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:42.894 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:c8:c7 10.100.0.5'], port_security=['fa:16:3e:f7:c8:c7 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'de990aa4-7e40-416a-8e7b-ebf90847bb68', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ad7bd988b6047509c2c19eb4e0dc32c', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dc82e88d-abda-4feb-bd34-afbed64798c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea21a402-508c-472e-bd89-e4a2e8cde5bb, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=a937ddac-5b07-4fc5-8b58-8cc93dee8cac) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:25:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:42.895 100322 INFO neutron.agent.ovn.metadata.agent [-] Port a937ddac-5b07-4fc5-8b58-8cc93dee8cac in datapath 0a07ba3d-468f-4279-9be2-b3ef141df6a7 bound to our chassis
Sep 30 07:25:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:42.897 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a07ba3d-468f-4279-9be2-b3ef141df6a7
Sep 30 07:25:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:42.908 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[81253803-1be0-4fc6-8458-230d60444975]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:42.908 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0a07ba3d-41 in ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7 namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 07:25:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:42.912 210650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0a07ba3d-40 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 07:25:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:42.912 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[cf099922-3f70-4dc9-a256-764e86dbcaa5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:42 compute-0 ovn_controller[91436]: 2025-09-30T07:25:42Z|00134|binding|INFO|Setting lport a937ddac-5b07-4fc5-8b58-8cc93dee8cac ovn-installed in OVS
Sep 30 07:25:42 compute-0 ovn_controller[91436]: 2025-09-30T07:25:42Z|00135|binding|INFO|Setting lport a937ddac-5b07-4fc5-8b58-8cc93dee8cac up in Southbound
Sep 30 07:25:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:42.913 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[70a2b308-5a98-45ae-b135-ec31dd24c336]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:42 compute-0 nova_compute[189265]: 2025-09-30 07:25:42.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:42 compute-0 nova_compute[189265]: 2025-09-30 07:25:42.924 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:42 compute-0 systemd-udevd[216970]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:25:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:42.934 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae251b3-36ac-4b53-9bf5-6c0e235de22f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:42 compute-0 systemd-machined[149233]: New machine qemu-9-instance-0000000d.
Sep 30 07:25:42 compute-0 NetworkManager[51813]: <info>  [1759217142.9487] device (tapa937ddac-5b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:25:42 compute-0 NetworkManager[51813]: <info>  [1759217142.9495] device (tapa937ddac-5b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:25:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:42.952 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[587cf52d-75ed-44dd-a7b4-231068bf60db]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:42 compute-0 systemd[1]: Started Virtual Machine qemu-9-instance-0000000d.
Sep 30 07:25:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:42.985 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[3d6069a5-3dc4-46b8-8400-fdf4b94da7fe]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:42.989 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[ddb7aceb-f439-49d0-a4aa-c95d7ed8f9e2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:42 compute-0 NetworkManager[51813]: <info>  [1759217142.9908] manager: (tap0a07ba3d-40): new Veth device (/org/freedesktop/NetworkManager/Devices/49)
Sep 30 07:25:42 compute-0 systemd-udevd[216974]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.013 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[8d1eed30-6efe-4f6f-8b3c-ac55b5f89174]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.015 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[3ba394cf-1e1c-4f70-9c1f-4df2afe75ead]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:43 compute-0 NetworkManager[51813]: <info>  [1759217143.0371] device (tap0a07ba3d-40): carrier: link connected
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.043 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[a68bd69a-e270-49d2-8bfb-dce7c99740af]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.058 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b5ba4d40-3317-49c3-a503-8f7090c8a2d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a07ba3d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:7d:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500075, 'reachable_time': 28912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217002, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.071 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[aa8de7d3-cc59-46e5-b7be-d170a6340db0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:7dc8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 500075, 'tstamp': 500075}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217003, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.082 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[5249a9e6-fb4d-4c88-8179-4e83a4f79751]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a07ba3d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:7d:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500075, 'reachable_time': 28912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217004, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.106 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[6f673cbb-56d8-48aa-a4f2-518487d4c053]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.161 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[2bca6b1d-67da-406a-8383-8077437527bc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.162 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a07ba3d-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.163 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.163 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a07ba3d-40, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:43 compute-0 kernel: tap0a07ba3d-40: entered promiscuous mode
Sep 30 07:25:43 compute-0 NetworkManager[51813]: <info>  [1759217143.1659] manager: (tap0a07ba3d-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.169 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a07ba3d-40, col_values=(('external_ids', {'iface-id': '8b5421e3-6f92-4b98-bc3c-4670813d915c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:25:43 compute-0 ovn_controller[91436]: 2025-09-30T07:25:43Z|00136|binding|INFO|Releasing lport 8b5421e3-6f92-4b98-bc3c-4670813d915c from this chassis (sb_readonly=0)
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.172 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[042f8895-811b-4db4-9032-efa9a4d9e6f1]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.172 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.172 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.173 100322 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for 0a07ba3d-468f-4279-9be2-b3ef141df6a7 disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.173 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.173 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7a83ad04-1ab8-4556-a8e0-d778190493dc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.173 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.173 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[afe1d19b-feb9-4318-91db-55aba9cfa220]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.174 100322 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: global
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     log         /dev/log local0 debug
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     log-tag     haproxy-metadata-proxy-0a07ba3d-468f-4279-9be2-b3ef141df6a7
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     user        root
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     group       root
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     maxconn     1024
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     pidfile     /var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     daemon
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: defaults
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     log global
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     mode http
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     option httplog
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     option dontlognull
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     option http-server-close
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     option forwardfor
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     retries                 3
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     timeout http-request    30s
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     timeout connect         30s
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     timeout client          32s
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     timeout server          32s
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     timeout http-keep-alive 30s
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: listen listener
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     bind 169.254.169.254:80
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:     http-request add-header X-OVN-Network-ID 0a07ba3d-468f-4279-9be2-b3ef141df6a7
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.174 100322 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'env', 'PROCESS_TAG=haproxy-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0a07ba3d-468f-4279-9be2-b3ef141df6a7.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.298 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:25:43 compute-0 podman[217043]: 2025-09-30 07:25:43.549759191 +0000 UTC m=+0.056992208 container create 02fa1edbadd5e66dcae84b82abf37acd38b948cd9f7ec1bf91b6e568dfdd9621 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:25:43 compute-0 systemd[1]: Started libpod-conmon-02fa1edbadd5e66dcae84b82abf37acd38b948cd9f7ec1bf91b6e568dfdd9621.scope.
Sep 30 07:25:43 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:25:43 compute-0 podman[217043]: 2025-09-30 07:25:43.515255584 +0000 UTC m=+0.022488631 image pull eeebcc09bc72f81ab45f5ab87eb8f6a7b554b949227aeec082bdb0732754ddc8 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 07:25:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4b884bf6ce7bff06ed9e8e4a936a76b49c90a435c925d242313a21f97f3dc53/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 07:25:43 compute-0 podman[217043]: 2025-09-30 07:25:43.625583322 +0000 UTC m=+0.132816339 container init 02fa1edbadd5e66dcae84b82abf37acd38b948cd9f7ec1bf91b6e568dfdd9621 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS)
Sep 30 07:25:43 compute-0 podman[217043]: 2025-09-30 07:25:43.630679739 +0000 UTC m=+0.137912756 container start 02fa1edbadd5e66dcae84b82abf37acd38b948cd9f7ec1bf91b6e568dfdd9621 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.638 2 DEBUG nova.compute.manager [req-5252e93c-d6cc-47e0-800e-b19255e35fe6 req-1dd655fe-a838-4462-bfa1-5cffaf484459 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Received event network-vif-plugged-a937ddac-5b07-4fc5-8b58-8cc93dee8cac external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.639 2 DEBUG oslo_concurrency.lockutils [req-5252e93c-d6cc-47e0-800e-b19255e35fe6 req-1dd655fe-a838-4462-bfa1-5cffaf484459 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.639 2 DEBUG oslo_concurrency.lockutils [req-5252e93c-d6cc-47e0-800e-b19255e35fe6 req-1dd655fe-a838-4462-bfa1-5cffaf484459 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.640 2 DEBUG oslo_concurrency.lockutils [req-5252e93c-d6cc-47e0-800e-b19255e35fe6 req-1dd655fe-a838-4462-bfa1-5cffaf484459 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.640 2 DEBUG nova.compute.manager [req-5252e93c-d6cc-47e0-800e-b19255e35fe6 req-1dd655fe-a838-4462-bfa1-5cffaf484459 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Processing event network-vif-plugged-a937ddac-5b07-4fc5-8b58-8cc93dee8cac _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.640 2 DEBUG nova.compute.manager [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.651 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 07:25:43 compute-0 neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7[217058]: [NOTICE]   (217062) : New worker (217064) forked
Sep 30 07:25:43 compute-0 neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7[217058]: [NOTICE]   (217062) : Loading success.
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.655 2 INFO nova.virt.libvirt.driver [-] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Instance spawned successfully.
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.656 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.699 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:43.700 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.813 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.814 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.814 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:25:43 compute-0 nova_compute[189265]: 2025-09-30 07:25:43.815 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:25:44 compute-0 nova_compute[189265]: 2025-09-30 07:25:44.172 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:25:44 compute-0 nova_compute[189265]: 2025-09-30 07:25:44.173 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:25:44 compute-0 nova_compute[189265]: 2025-09-30 07:25:44.174 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:25:44 compute-0 nova_compute[189265]: 2025-09-30 07:25:44.175 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:25:44 compute-0 nova_compute[189265]: 2025-09-30 07:25:44.176 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:25:44 compute-0 nova_compute[189265]: 2025-09-30 07:25:44.177 2 DEBUG nova.virt.libvirt.driver [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:25:44 compute-0 nova_compute[189265]: 2025-09-30 07:25:44.687 2 INFO nova.compute.manager [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Took 10.83 seconds to spawn the instance on the hypervisor.
Sep 30 07:25:44 compute-0 nova_compute[189265]: 2025-09-30 07:25:44.689 2 DEBUG nova.compute.manager [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:25:44 compute-0 nova_compute[189265]: 2025-09-30 07:25:44.757 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:44 compute-0 nova_compute[189265]: 2025-09-30 07:25:44.861 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:25:44 compute-0 nova_compute[189265]: 2025-09-30 07:25:44.944 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:25:44 compute-0 nova_compute[189265]: 2025-09-30 07:25:44.945 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:25:45 compute-0 nova_compute[189265]: 2025-09-30 07:25:45.015 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:25:45 compute-0 nova_compute[189265]: 2025-09-30 07:25:45.179 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:25:45 compute-0 nova_compute[189265]: 2025-09-30 07:25:45.181 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:25:45 compute-0 nova_compute[189265]: 2025-09-30 07:25:45.202 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:25:45 compute-0 nova_compute[189265]: 2025-09-30 07:25:45.203 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5792MB free_disk=73.3030891418457GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:25:45 compute-0 nova_compute[189265]: 2025-09-30 07:25:45.204 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:25:45 compute-0 nova_compute[189265]: 2025-09-30 07:25:45.204 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:25:45 compute-0 nova_compute[189265]: 2025-09-30 07:25:45.314 2 INFO nova.compute.manager [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Took 16.18 seconds to build instance.
Sep 30 07:25:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:25:45.702 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:25:45 compute-0 nova_compute[189265]: 2025-09-30 07:25:45.714 2 DEBUG nova.compute.manager [req-10221158-d4cb-43a6-a55d-f13d1ed0fa77 req-c660803f-22fa-4602-9f72-0819217f184f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Received event network-vif-plugged-a937ddac-5b07-4fc5-8b58-8cc93dee8cac external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:25:45 compute-0 nova_compute[189265]: 2025-09-30 07:25:45.715 2 DEBUG oslo_concurrency.lockutils [req-10221158-d4cb-43a6-a55d-f13d1ed0fa77 req-c660803f-22fa-4602-9f72-0819217f184f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:25:45 compute-0 nova_compute[189265]: 2025-09-30 07:25:45.716 2 DEBUG oslo_concurrency.lockutils [req-10221158-d4cb-43a6-a55d-f13d1ed0fa77 req-c660803f-22fa-4602-9f72-0819217f184f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:25:45 compute-0 nova_compute[189265]: 2025-09-30 07:25:45.716 2 DEBUG oslo_concurrency.lockutils [req-10221158-d4cb-43a6-a55d-f13d1ed0fa77 req-c660803f-22fa-4602-9f72-0819217f184f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:25:45 compute-0 nova_compute[189265]: 2025-09-30 07:25:45.717 2 DEBUG nova.compute.manager [req-10221158-d4cb-43a6-a55d-f13d1ed0fa77 req-c660803f-22fa-4602-9f72-0819217f184f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] No waiting events found dispatching network-vif-plugged-a937ddac-5b07-4fc5-8b58-8cc93dee8cac pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:25:45 compute-0 nova_compute[189265]: 2025-09-30 07:25:45.717 2 WARNING nova.compute.manager [req-10221158-d4cb-43a6-a55d-f13d1ed0fa77 req-c660803f-22fa-4602-9f72-0819217f184f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Received unexpected event network-vif-plugged-a937ddac-5b07-4fc5-8b58-8cc93dee8cac for instance with vm_state active and task_state None.
Sep 30 07:25:45 compute-0 nova_compute[189265]: 2025-09-30 07:25:45.829 2 DEBUG oslo_concurrency.lockutils [None req-b6afc04d-12fa-48f8-be82-31f7c5202d06 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "de990aa4-7e40-416a-8e7b-ebf90847bb68" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.708s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:25:46 compute-0 nova_compute[189265]: 2025-09-30 07:25:46.267 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance de990aa4-7e40-416a-8e7b-ebf90847bb68 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:25:46 compute-0 nova_compute[189265]: 2025-09-30 07:25:46.268 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:25:46 compute-0 nova_compute[189265]: 2025-09-30 07:25:46.268 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:25:45 up  1:23,  0 user,  load average: 0.24, 0.22, 0.35\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_2ad7bd988b6047509c2c19eb4e0dc32c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:25:46 compute-0 nova_compute[189265]: 2025-09-30 07:25:46.323 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:25:46 compute-0 nova_compute[189265]: 2025-09-30 07:25:46.832 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:25:47 compute-0 nova_compute[189265]: 2025-09-30 07:25:47.344 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:25:47 compute-0 nova_compute[189265]: 2025-09-30 07:25:47.345 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.140s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:25:47 compute-0 nova_compute[189265]: 2025-09-30 07:25:47.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:48 compute-0 nova_compute[189265]: 2025-09-30 07:25:48.834 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:25:48 compute-0 nova_compute[189265]: 2025-09-30 07:25:48.835 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:25:49 compute-0 podman[217082]: 2025-09-30 07:25:49.479415678 +0000 UTC m=+0.060282813 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:25:49 compute-0 nova_compute[189265]: 2025-09-30 07:25:49.759 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:51 compute-0 nova_compute[189265]: 2025-09-30 07:25:51.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:25:52 compute-0 nova_compute[189265]: 2025-09-30 07:25:52.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:54 compute-0 nova_compute[189265]: 2025-09-30 07:25:54.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:55 compute-0 ovn_controller[91436]: 2025-09-30T07:25:55Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f7:c8:c7 10.100.0.5
Sep 30 07:25:55 compute-0 ovn_controller[91436]: 2025-09-30T07:25:55Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f7:c8:c7 10.100.0.5
Sep 30 07:25:57 compute-0 nova_compute[189265]: 2025-09-30 07:25:57.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:25:58 compute-0 podman[217118]: 2025-09-30 07:25:58.464751934 +0000 UTC m=+0.052816047 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 07:25:58 compute-0 nova_compute[189265]: 2025-09-30 07:25:58.659 2 DEBUG nova.virt.libvirt.driver [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Creating tmpfile /var/lib/nova/instances/tmpxoxgb2p4 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 07:25:58 compute-0 nova_compute[189265]: 2025-09-30 07:25:58.660 2 WARNING neutronclient.v2_0.client [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:25:58 compute-0 nova_compute[189265]: 2025-09-30 07:25:58.663 2 DEBUG nova.compute.manager [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxoxgb2p4',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 07:25:59 compute-0 podman[199733]: time="2025-09-30T07:25:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:25:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:25:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:25:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:25:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3470 "" "Go-http-client/1.1"
Sep 30 07:25:59 compute-0 nova_compute[189265]: 2025-09-30 07:25:59.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:00 compute-0 nova_compute[189265]: 2025-09-30 07:26:00.699 2 WARNING neutronclient.v2_0.client [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:01 compute-0 openstack_network_exporter[201859]: ERROR   07:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:26:01 compute-0 openstack_network_exporter[201859]: ERROR   07:26:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:26:01 compute-0 openstack_network_exporter[201859]: ERROR   07:26:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:26:01 compute-0 openstack_network_exporter[201859]: ERROR   07:26:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:26:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:26:01 compute-0 openstack_network_exporter[201859]: ERROR   07:26:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:26:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:26:01 compute-0 podman[217139]: 2025-09-30 07:26:01.500221097 +0000 UTC m=+0.076925808 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 07:26:02 compute-0 nova_compute[189265]: 2025-09-30 07:26:02.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:04 compute-0 nova_compute[189265]: 2025-09-30 07:26:04.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:05 compute-0 nova_compute[189265]: 2025-09-30 07:26:05.384 2 DEBUG nova.compute.manager [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxoxgb2p4',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ba50db09-103a-463d-9b29-917488cc4974',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 07:26:05 compute-0 podman[217160]: 2025-09-30 07:26:05.482811481 +0000 UTC m=+0.068661573 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 07:26:05 compute-0 podman[217161]: 2025-09-30 07:26:05.50248109 +0000 UTC m=+0.077641598 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 07:26:05 compute-0 podman[217162]: 2025-09-30 07:26:05.538243106 +0000 UTC m=+0.119328833 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=ovn_controller)
Sep 30 07:26:06 compute-0 nova_compute[189265]: 2025-09-30 07:26:06.400 2 DEBUG oslo_concurrency.lockutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-ba50db09-103a-463d-9b29-917488cc4974" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:26:06 compute-0 nova_compute[189265]: 2025-09-30 07:26:06.400 2 DEBUG oslo_concurrency.lockutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-ba50db09-103a-463d-9b29-917488cc4974" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:26:06 compute-0 nova_compute[189265]: 2025-09-30 07:26:06.400 2 DEBUG nova.network.neutron [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:26:06 compute-0 nova_compute[189265]: 2025-09-30 07:26:06.909 2 WARNING neutronclient.v2_0.client [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:07 compute-0 nova_compute[189265]: 2025-09-30 07:26:07.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:08 compute-0 nova_compute[189265]: 2025-09-30 07:26:08.506 2 WARNING neutronclient.v2_0.client [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:08 compute-0 nova_compute[189265]: 2025-09-30 07:26:08.700 2 DEBUG nova.network.neutron [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Updating instance_info_cache with network_info: [{"id": "51b727ab-5fda-4e00-aef4-af6f5d5601f9", "address": "fa:16:3e:1e:a9:8c", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51b727ab-5f", "ovs_interfaceid": "51b727ab-5fda-4e00-aef4-af6f5d5601f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.207 2 DEBUG oslo_concurrency.lockutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-ba50db09-103a-463d-9b29-917488cc4974" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.225 2 DEBUG nova.virt.libvirt.driver [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxoxgb2p4',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ba50db09-103a-463d-9b29-917488cc4974',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.226 2 DEBUG nova.virt.libvirt.driver [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Creating instance directory: /var/lib/nova/instances/ba50db09-103a-463d-9b29-917488cc4974 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.226 2 DEBUG nova.virt.libvirt.driver [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Creating disk.info with the contents: {'/var/lib/nova/instances/ba50db09-103a-463d-9b29-917488cc4974/disk': 'qcow2', '/var/lib/nova/instances/ba50db09-103a-463d-9b29-917488cc4974/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.227 2 DEBUG nova.virt.libvirt.driver [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.228 2 DEBUG nova.objects.instance [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid ba50db09-103a-463d-9b29-917488cc4974 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.738 2 DEBUG oslo_utils.imageutils.format_inspector [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.744 2 DEBUG oslo_utils.imageutils.format_inspector [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.747 2 DEBUG oslo_concurrency.processutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.799 2 DEBUG oslo_concurrency.processutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.800 2 DEBUG oslo_concurrency.lockutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.801 2 DEBUG oslo_concurrency.lockutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.802 2 DEBUG oslo_utils.imageutils.format_inspector [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.808 2 DEBUG oslo_utils.imageutils.format_inspector [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.809 2 DEBUG oslo_concurrency.processutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.862 2 DEBUG oslo_concurrency.processutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.864 2 DEBUG oslo_concurrency.processutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/ba50db09-103a-463d-9b29-917488cc4974/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.915 2 DEBUG oslo_concurrency.processutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/ba50db09-103a-463d-9b29-917488cc4974/disk 1073741824" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.916 2 DEBUG oslo_concurrency.lockutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.917 2 DEBUG oslo_concurrency.processutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.985 2 DEBUG oslo_concurrency.processutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.986 2 DEBUG nova.virt.disk.api [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Checking if we can resize image /var/lib/nova/instances/ba50db09-103a-463d-9b29-917488cc4974/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:26:09 compute-0 nova_compute[189265]: 2025-09-30 07:26:09.987 2 DEBUG oslo_concurrency.processutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba50db09-103a-463d-9b29-917488cc4974/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.037 2 DEBUG oslo_concurrency.processutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ba50db09-103a-463d-9b29-917488cc4974/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.038 2 DEBUG nova.virt.disk.api [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Cannot resize image /var/lib/nova/instances/ba50db09-103a-463d-9b29-917488cc4974/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.039 2 DEBUG nova.objects.instance [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'migration_context' on Instance uuid ba50db09-103a-463d-9b29-917488cc4974 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.549 2 DEBUG nova.objects.base [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<ba50db09-103a-463d-9b29-917488cc4974> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.550 2 DEBUG oslo_concurrency.processutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/ba50db09-103a-463d-9b29-917488cc4974/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.584 2 DEBUG oslo_concurrency.processutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/ba50db09-103a-463d-9b29-917488cc4974/disk.config 497664" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.585 2 DEBUG nova.virt.libvirt.driver [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.587 2 DEBUG nova.virt.libvirt.vif [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T07:25:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1193476648',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1193476648',id=12,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:25:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2ad7bd988b6047509c2c19eb4e0dc32c',ramdisk_id='',reservation_id='r-7f0wnzmz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-385408215',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-385408215-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:25:22Z,user_data=None,user_id='071bf5838f2f473a865873b6f7846f84',uuid=ba50db09-103a-463d-9b29-917488cc4974,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "51b727ab-5fda-4e00-aef4-af6f5d5601f9", "address": "fa:16:3e:1e:a9:8c", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap51b727ab-5f", "ovs_interfaceid": "51b727ab-5fda-4e00-aef4-af6f5d5601f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.587 2 DEBUG nova.network.os_vif_util [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "51b727ab-5fda-4e00-aef4-af6f5d5601f9", "address": "fa:16:3e:1e:a9:8c", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap51b727ab-5f", "ovs_interfaceid": "51b727ab-5fda-4e00-aef4-af6f5d5601f9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.588 2 DEBUG nova.network.os_vif_util [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a9:8c,bridge_name='br-int',has_traffic_filtering=True,id=51b727ab-5fda-4e00-aef4-af6f5d5601f9,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51b727ab-5f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.588 2 DEBUG os_vif [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a9:8c,bridge_name='br-int',has_traffic_filtering=True,id=51b727ab-5fda-4e00-aef4-af6f5d5601f9,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51b727ab-5f') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.589 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.589 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.590 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.591 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c675c555-a17c-503b-b1a1-4ec0cc9bc7a0', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.598 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap51b727ab-5f, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.598 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap51b727ab-5f, col_values=(('qos', UUID('b7a501ee-ca91-4195-b53c-ea7c04b42d43')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.599 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap51b727ab-5f, col_values=(('external_ids', {'iface-id': '51b727ab-5fda-4e00-aef4-af6f5d5601f9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:a9:8c', 'vm-uuid': 'ba50db09-103a-463d-9b29-917488cc4974'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.600 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:10 compute-0 NetworkManager[51813]: <info>  [1759217170.6019] manager: (tap51b727ab-5f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.612 2 INFO os_vif [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:a9:8c,bridge_name='br-int',has_traffic_filtering=True,id=51b727ab-5fda-4e00-aef4-af6f5d5601f9,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51b727ab-5f')
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.612 2 DEBUG nova.virt.libvirt.driver [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.613 2 DEBUG nova.compute.manager [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxoxgb2p4',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ba50db09-103a-463d-9b29-917488cc4974',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.613 2 WARNING neutronclient.v2_0.client [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:10 compute-0 nova_compute[189265]: 2025-09-30 07:26:10.786 2 WARNING neutronclient.v2_0.client [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:11 compute-0 nova_compute[189265]: 2025-09-30 07:26:11.820 2 DEBUG nova.network.neutron [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Port 51b727ab-5fda-4e00-aef4-af6f5d5601f9 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 07:26:11 compute-0 nova_compute[189265]: 2025-09-30 07:26:11.837 2 DEBUG nova.compute.manager [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpxoxgb2p4',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='ba50db09-103a-463d-9b29-917488cc4974',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 07:26:12 compute-0 nova_compute[189265]: 2025-09-30 07:26:12.843 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:12 compute-0 ovn_controller[91436]: 2025-09-30T07:26:12Z|00137|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Sep 30 07:26:15 compute-0 kernel: tap51b727ab-5f: entered promiscuous mode
Sep 30 07:26:15 compute-0 NetworkManager[51813]: <info>  [1759217175.0018] manager: (tap51b727ab-5f): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Sep 30 07:26:15 compute-0 ovn_controller[91436]: 2025-09-30T07:26:15Z|00138|binding|INFO|Claiming lport 51b727ab-5fda-4e00-aef4-af6f5d5601f9 for this additional chassis.
Sep 30 07:26:15 compute-0 ovn_controller[91436]: 2025-09-30T07:26:15Z|00139|binding|INFO|51b727ab-5fda-4e00-aef4-af6f5d5601f9: Claiming fa:16:3e:1e:a9:8c 10.100.0.6
Sep 30 07:26:15 compute-0 nova_compute[189265]: 2025-09-30 07:26:15.003 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:15.014 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:a9:8c 10.100.0.6'], port_security=['fa:16:3e:1e:a9:8c 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'ba50db09-103a-463d-9b29-917488cc4974', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ad7bd988b6047509c2c19eb4e0dc32c', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'dc82e88d-abda-4feb-bd34-afbed64798c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea21a402-508c-472e-bd89-e4a2e8cde5bb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=51b727ab-5fda-4e00-aef4-af6f5d5601f9) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:26:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:15.016 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 51b727ab-5fda-4e00-aef4-af6f5d5601f9 in datapath 0a07ba3d-468f-4279-9be2-b3ef141df6a7 unbound from our chassis
Sep 30 07:26:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:15.019 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a07ba3d-468f-4279-9be2-b3ef141df6a7
Sep 30 07:26:15 compute-0 ovn_controller[91436]: 2025-09-30T07:26:15Z|00140|binding|INFO|Setting lport 51b727ab-5fda-4e00-aef4-af6f5d5601f9 ovn-installed in OVS
Sep 30 07:26:15 compute-0 nova_compute[189265]: 2025-09-30 07:26:15.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:15 compute-0 nova_compute[189265]: 2025-09-30 07:26:15.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:15 compute-0 systemd-udevd[217253]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:26:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:15.040 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[08973402-f474-418e-aab0-fb704299bbcf]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:15 compute-0 NetworkManager[51813]: <info>  [1759217175.0572] device (tap51b727ab-5f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:26:15 compute-0 NetworkManager[51813]: <info>  [1759217175.0586] device (tap51b727ab-5f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:26:15 compute-0 systemd-machined[149233]: New machine qemu-10-instance-0000000c.
Sep 30 07:26:15 compute-0 systemd[1]: Started Virtual Machine qemu-10-instance-0000000c.
Sep 30 07:26:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:15.082 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[040e99a9-1e50-40aa-9cd4-269b1bad09d6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:15.088 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[d64b4d9b-26b7-4845-8755-d84054126e2f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:15.127 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[1a6cfac7-c8a4-4209-ad29-ed3ab6f3289d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:15.150 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[00bdd0ec-31d2-4b41-97b0-5f887bd99696]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a07ba3d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:7d:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500075, 'reachable_time': 28912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217268, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:15.172 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[cb4f6992-d65b-454c-9689-2999a30556a8]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0a07ba3d-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 500085, 'tstamp': 500085}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217270, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0a07ba3d-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 500087, 'tstamp': 500087}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217270, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:15.173 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a07ba3d-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:26:15 compute-0 nova_compute[189265]: 2025-09-30 07:26:15.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:15 compute-0 nova_compute[189265]: 2025-09-30 07:26:15.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:15.176 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a07ba3d-40, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:26:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:15.176 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:26:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:15.177 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a07ba3d-40, col_values=(('external_ids', {'iface-id': '8b5421e3-6f92-4b98-bc3c-4670813d915c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:26:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:15.177 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:26:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:15.178 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[52cb423f-a531-478f-998d-a808e5e1e35b]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-0a07ba3d-468f-4279-9be2-b3ef141df6a7\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 0a07ba3d-468f-4279-9be2-b3ef141df6a7\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:15 compute-0 nova_compute[189265]: 2025-09-30 07:26:15.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:17 compute-0 ovn_controller[91436]: 2025-09-30T07:26:17Z|00141|binding|INFO|Claiming lport 51b727ab-5fda-4e00-aef4-af6f5d5601f9 for this chassis.
Sep 30 07:26:17 compute-0 ovn_controller[91436]: 2025-09-30T07:26:17Z|00142|binding|INFO|51b727ab-5fda-4e00-aef4-af6f5d5601f9: Claiming fa:16:3e:1e:a9:8c 10.100.0.6
Sep 30 07:26:17 compute-0 ovn_controller[91436]: 2025-09-30T07:26:17Z|00143|binding|INFO|Setting lport 51b727ab-5fda-4e00-aef4-af6f5d5601f9 up in Southbound
Sep 30 07:26:17 compute-0 nova_compute[189265]: 2025-09-30 07:26:17.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:18 compute-0 nova_compute[189265]: 2025-09-30 07:26:18.441 2 INFO nova.compute.manager [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Post operation of migration started
Sep 30 07:26:18 compute-0 nova_compute[189265]: 2025-09-30 07:26:18.442 2 WARNING neutronclient.v2_0.client [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:19 compute-0 nova_compute[189265]: 2025-09-30 07:26:19.529 2 WARNING neutronclient.v2_0.client [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:19 compute-0 nova_compute[189265]: 2025-09-30 07:26:19.530 2 WARNING neutronclient.v2_0.client [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:20 compute-0 podman[217292]: 2025-09-30 07:26:20.502291571 +0000 UTC m=+0.078983986 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:26:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:20.554 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:26:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:20.555 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:26:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:20.556 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:26:20 compute-0 nova_compute[189265]: 2025-09-30 07:26:20.569 2 DEBUG oslo_concurrency.lockutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-ba50db09-103a-463d-9b29-917488cc4974" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:26:20 compute-0 nova_compute[189265]: 2025-09-30 07:26:20.569 2 DEBUG oslo_concurrency.lockutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-ba50db09-103a-463d-9b29-917488cc4974" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:26:20 compute-0 nova_compute[189265]: 2025-09-30 07:26:20.569 2 DEBUG nova.network.neutron [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:26:20 compute-0 nova_compute[189265]: 2025-09-30 07:26:20.602 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:21 compute-0 nova_compute[189265]: 2025-09-30 07:26:21.076 2 WARNING neutronclient.v2_0.client [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:22 compute-0 nova_compute[189265]: 2025-09-30 07:26:22.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:23 compute-0 nova_compute[189265]: 2025-09-30 07:26:23.514 2 WARNING neutronclient.v2_0.client [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:23 compute-0 nova_compute[189265]: 2025-09-30 07:26:23.672 2 DEBUG nova.network.neutron [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Updating instance_info_cache with network_info: [{"id": "51b727ab-5fda-4e00-aef4-af6f5d5601f9", "address": "fa:16:3e:1e:a9:8c", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51b727ab-5f", "ovs_interfaceid": "51b727ab-5fda-4e00-aef4-af6f5d5601f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:26:24 compute-0 nova_compute[189265]: 2025-09-30 07:26:24.286 2 DEBUG oslo_concurrency.lockutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-ba50db09-103a-463d-9b29-917488cc4974" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:26:24 compute-0 nova_compute[189265]: 2025-09-30 07:26:24.808 2 DEBUG oslo_concurrency.lockutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:26:24 compute-0 nova_compute[189265]: 2025-09-30 07:26:24.809 2 DEBUG oslo_concurrency.lockutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:26:24 compute-0 nova_compute[189265]: 2025-09-30 07:26:24.809 2 DEBUG oslo_concurrency.lockutils [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:26:24 compute-0 nova_compute[189265]: 2025-09-30 07:26:24.813 2 INFO nova.virt.libvirt.driver [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 07:26:24 compute-0 virtqemud[189090]: Domain id=10 name='instance-0000000c' uuid=ba50db09-103a-463d-9b29-917488cc4974 is tainted: custom-monitor
Sep 30 07:26:25 compute-0 nova_compute[189265]: 2025-09-30 07:26:25.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:25 compute-0 nova_compute[189265]: 2025-09-30 07:26:25.821 2 INFO nova.virt.libvirt.driver [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 07:26:26 compute-0 nova_compute[189265]: 2025-09-30 07:26:26.828 2 INFO nova.virt.libvirt.driver [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 07:26:26 compute-0 nova_compute[189265]: 2025-09-30 07:26:26.833 2 DEBUG nova.compute.manager [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:26:27 compute-0 sshd-session[217317]: Invalid user user from 80.94.95.116 port 46608
Sep 30 07:26:27 compute-0 sshd-session[217317]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:26:27 compute-0 sshd-session[217317]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.116
Sep 30 07:26:27 compute-0 nova_compute[189265]: 2025-09-30 07:26:27.348 2 DEBUG nova.objects.instance [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 07:26:27 compute-0 nova_compute[189265]: 2025-09-30 07:26:27.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:28 compute-0 nova_compute[189265]: 2025-09-30 07:26:28.370 2 WARNING neutronclient.v2_0.client [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:28 compute-0 nova_compute[189265]: 2025-09-30 07:26:28.534 2 WARNING neutronclient.v2_0.client [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:28 compute-0 nova_compute[189265]: 2025-09-30 07:26:28.534 2 WARNING neutronclient.v2_0.client [None req-bc582006-82af-47d4-98b0-fb0e37d1cc22 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:28 compute-0 sshd-session[217317]: Failed password for invalid user user from 80.94.95.116 port 46608 ssh2
Sep 30 07:26:29 compute-0 podman[217319]: 2025-09-30 07:26:29.516988598 +0000 UTC m=+0.083262358 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible)
Sep 30 07:26:29 compute-0 sshd-session[217317]: Connection closed by invalid user user 80.94.95.116 port 46608 [preauth]
Sep 30 07:26:29 compute-0 podman[199733]: time="2025-09-30T07:26:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:26:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:26:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:26:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:26:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3478 "" "Go-http-client/1.1"
Sep 30 07:26:30 compute-0 nova_compute[189265]: 2025-09-30 07:26:30.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:31 compute-0 openstack_network_exporter[201859]: ERROR   07:26:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:26:31 compute-0 openstack_network_exporter[201859]: ERROR   07:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:26:31 compute-0 openstack_network_exporter[201859]: ERROR   07:26:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:26:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:26:31 compute-0 openstack_network_exporter[201859]: ERROR   07:26:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:26:31 compute-0 openstack_network_exporter[201859]: ERROR   07:26:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:26:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:26:32 compute-0 podman[217339]: 2025-09-30 07:26:32.515803225 +0000 UTC m=+0.092831950 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible)
Sep 30 07:26:32 compute-0 nova_compute[189265]: 2025-09-30 07:26:32.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:33 compute-0 nova_compute[189265]: 2025-09-30 07:26:33.756 2 DEBUG oslo_concurrency.lockutils [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "de990aa4-7e40-416a-8e7b-ebf90847bb68" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:26:33 compute-0 nova_compute[189265]: 2025-09-30 07:26:33.757 2 DEBUG oslo_concurrency.lockutils [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "de990aa4-7e40-416a-8e7b-ebf90847bb68" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:26:33 compute-0 nova_compute[189265]: 2025-09-30 07:26:33.757 2 DEBUG oslo_concurrency.lockutils [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:26:33 compute-0 nova_compute[189265]: 2025-09-30 07:26:33.757 2 DEBUG oslo_concurrency.lockutils [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:26:33 compute-0 nova_compute[189265]: 2025-09-30 07:26:33.758 2 DEBUG oslo_concurrency.lockutils [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:26:33 compute-0 nova_compute[189265]: 2025-09-30 07:26:33.774 2 INFO nova.compute.manager [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Terminating instance
Sep 30 07:26:34 compute-0 nova_compute[189265]: 2025-09-30 07:26:34.292 2 DEBUG nova.compute.manager [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:26:34 compute-0 kernel: tapa937ddac-5b (unregistering): left promiscuous mode
Sep 30 07:26:34 compute-0 NetworkManager[51813]: <info>  [1759217194.3171] device (tapa937ddac-5b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:26:34 compute-0 ovn_controller[91436]: 2025-09-30T07:26:34Z|00144|binding|INFO|Releasing lport a937ddac-5b07-4fc5-8b58-8cc93dee8cac from this chassis (sb_readonly=0)
Sep 30 07:26:34 compute-0 ovn_controller[91436]: 2025-09-30T07:26:34Z|00145|binding|INFO|Setting lport a937ddac-5b07-4fc5-8b58-8cc93dee8cac down in Southbound
Sep 30 07:26:34 compute-0 ovn_controller[91436]: 2025-09-30T07:26:34Z|00146|binding|INFO|Removing iface tapa937ddac-5b ovn-installed in OVS
Sep 30 07:26:34 compute-0 nova_compute[189265]: 2025-09-30 07:26:34.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:34.338 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f7:c8:c7 10.100.0.5'], port_security=['fa:16:3e:f7:c8:c7 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'de990aa4-7e40-416a-8e7b-ebf90847bb68', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ad7bd988b6047509c2c19eb4e0dc32c', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'dc82e88d-abda-4feb-bd34-afbed64798c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea21a402-508c-472e-bd89-e4a2e8cde5bb, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=a937ddac-5b07-4fc5-8b58-8cc93dee8cac) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:26:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:34.339 100322 INFO neutron.agent.ovn.metadata.agent [-] Port a937ddac-5b07-4fc5-8b58-8cc93dee8cac in datapath 0a07ba3d-468f-4279-9be2-b3ef141df6a7 unbound from our chassis
Sep 30 07:26:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:34.342 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0a07ba3d-468f-4279-9be2-b3ef141df6a7
Sep 30 07:26:34 compute-0 nova_compute[189265]: 2025-09-30 07:26:34.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:34.369 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[50e7e4ae-43f9-461c-abf7-3a87ca58c7c5]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:34.413 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[faf47a1c-1535-4af7-a9a7-790de316f275]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:34.415 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[c9fa923b-2152-4b33-b6a9-e624261793ac]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:34 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Sep 30 07:26:34 compute-0 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000000d.scope: Consumed 13.894s CPU time.
Sep 30 07:26:34 compute-0 systemd-machined[149233]: Machine qemu-9-instance-0000000d terminated.
Sep 30 07:26:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:34.453 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[fb2e2c92-f062-4126-9bdd-526761a8a156]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:34.479 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[e6eff9e6-1bd1-4ed5-8caa-b6f4ed928fec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0a07ba3d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:7d:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 33], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500075, 'reachable_time': 28912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217371, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:34.501 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[ea62dd0e-a9ac-4c53-8c40-a7b95580f1d3]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap0a07ba3d-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 500085, 'tstamp': 500085}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217372, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap0a07ba3d-41'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 500087, 'tstamp': 500087}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217372, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:34.502 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a07ba3d-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:26:34 compute-0 nova_compute[189265]: 2025-09-30 07:26:34.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:34 compute-0 nova_compute[189265]: 2025-09-30 07:26:34.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:34.509 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a07ba3d-40, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:26:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:34.509 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:26:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:34.510 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0a07ba3d-40, col_values=(('external_ids', {'iface-id': '8b5421e3-6f92-4b98-bc3c-4670813d915c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:26:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:34.510 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:26:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:34.511 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[66b3d720-dd60-4c4b-b929-3f9632af88f0]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-0a07ba3d-468f-4279-9be2-b3ef141df6a7\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID 0a07ba3d-468f-4279-9be2-b3ef141df6a7\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:34 compute-0 nova_compute[189265]: 2025-09-30 07:26:34.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:34 compute-0 nova_compute[189265]: 2025-09-30 07:26:34.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:34 compute-0 nova_compute[189265]: 2025-09-30 07:26:34.571 2 INFO nova.virt.libvirt.driver [-] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Instance destroyed successfully.
Sep 30 07:26:34 compute-0 nova_compute[189265]: 2025-09-30 07:26:34.572 2 DEBUG nova.objects.instance [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lazy-loading 'resources' on Instance uuid de990aa4-7e40-416a-8e7b-ebf90847bb68 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:26:34 compute-0 nova_compute[189265]: 2025-09-30 07:26:34.599 2 DEBUG nova.compute.manager [req-beb2f42d-f97d-4652-a506-c0552c456872 req-a712e909-8b34-4038-9b66-6d5918c562bd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Received event network-vif-unplugged-a937ddac-5b07-4fc5-8b58-8cc93dee8cac external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:26:34 compute-0 nova_compute[189265]: 2025-09-30 07:26:34.599 2 DEBUG oslo_concurrency.lockutils [req-beb2f42d-f97d-4652-a506-c0552c456872 req-a712e909-8b34-4038-9b66-6d5918c562bd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:26:34 compute-0 nova_compute[189265]: 2025-09-30 07:26:34.599 2 DEBUG oslo_concurrency.lockutils [req-beb2f42d-f97d-4652-a506-c0552c456872 req-a712e909-8b34-4038-9b66-6d5918c562bd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:26:34 compute-0 nova_compute[189265]: 2025-09-30 07:26:34.599 2 DEBUG oslo_concurrency.lockutils [req-beb2f42d-f97d-4652-a506-c0552c456872 req-a712e909-8b34-4038-9b66-6d5918c562bd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:26:34 compute-0 nova_compute[189265]: 2025-09-30 07:26:34.600 2 DEBUG nova.compute.manager [req-beb2f42d-f97d-4652-a506-c0552c456872 req-a712e909-8b34-4038-9b66-6d5918c562bd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] No waiting events found dispatching network-vif-unplugged-a937ddac-5b07-4fc5-8b58-8cc93dee8cac pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:26:34 compute-0 nova_compute[189265]: 2025-09-30 07:26:34.600 2 DEBUG nova.compute.manager [req-beb2f42d-f97d-4652-a506-c0552c456872 req-a712e909-8b34-4038-9b66-6d5918c562bd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Received event network-vif-unplugged-a937ddac-5b07-4fc5-8b58-8cc93dee8cac for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.080 2 DEBUG nova.virt.libvirt.vif [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:25:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1592299149',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1592299149',id=13,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:25:44Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ad7bd988b6047509c2c19eb4e0dc32c',ramdisk_id='',reservation_id='r-2fx9o6oa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_
min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-385408215',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-385408215-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:25:44Z,user_data=None,user_id='071bf5838f2f473a865873b6f7846f84',uuid=de990aa4-7e40-416a-8e7b-ebf90847bb68,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "address": "fa:16:3e:f7:c8:c7", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa937ddac-5b", "ovs_interfaceid": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.081 2 DEBUG nova.network.os_vif_util [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Converting VIF {"id": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "address": "fa:16:3e:f7:c8:c7", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa937ddac-5b", "ovs_interfaceid": "a937ddac-5b07-4fc5-8b58-8cc93dee8cac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.082 2 DEBUG nova.network.os_vif_util [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f7:c8:c7,bridge_name='br-int',has_traffic_filtering=True,id=a937ddac-5b07-4fc5-8b58-8cc93dee8cac,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa937ddac-5b') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.082 2 DEBUG os_vif [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:c8:c7,bridge_name='br-int',has_traffic_filtering=True,id=a937ddac-5b07-4fc5-8b58-8cc93dee8cac,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa937ddac-5b') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.087 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa937ddac-5b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.093 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=f2d9a9fb-b44c-448c-ab14-8d084cb25d44) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.099 2 INFO os_vif [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f7:c8:c7,bridge_name='br-int',has_traffic_filtering=True,id=a937ddac-5b07-4fc5-8b58-8cc93dee8cac,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa937ddac-5b')
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.100 2 INFO nova.virt.libvirt.driver [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Deleting instance files /var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68_del
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.101 2 INFO nova.virt.libvirt.driver [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Deletion of /var/lib/nova/instances/de990aa4-7e40-416a-8e7b-ebf90847bb68_del complete
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.616 2 INFO nova.compute.manager [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Took 1.32 seconds to destroy the instance on the hypervisor.
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.616 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.617 2 DEBUG nova.compute.manager [-] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.617 2 DEBUG nova.network.neutron [-] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:26:35 compute-0 nova_compute[189265]: 2025-09-30 07:26:35.618 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:36 compute-0 podman[217390]: 2025-09-30 07:26:36.478228053 +0000 UTC m=+0.062998132 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 07:26:36 compute-0 podman[217391]: 2025-09-30 07:26:36.483909084 +0000 UTC m=+0.064597227 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:26:36 compute-0 podman[217392]: 2025-09-30 07:26:36.504102508 +0000 UTC m=+0.086201441 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 07:26:36 compute-0 nova_compute[189265]: 2025-09-30 07:26:36.552 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:36 compute-0 nova_compute[189265]: 2025-09-30 07:26:36.664 2 DEBUG nova.compute.manager [req-57edc9d6-20c3-474e-8ae6-b65c556563ed req-332a3e89-0625-4437-8dde-3e7adc7685c2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Received event network-vif-unplugged-a937ddac-5b07-4fc5-8b58-8cc93dee8cac external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:26:36 compute-0 nova_compute[189265]: 2025-09-30 07:26:36.664 2 DEBUG oslo_concurrency.lockutils [req-57edc9d6-20c3-474e-8ae6-b65c556563ed req-332a3e89-0625-4437-8dde-3e7adc7685c2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:26:36 compute-0 nova_compute[189265]: 2025-09-30 07:26:36.665 2 DEBUG oslo_concurrency.lockutils [req-57edc9d6-20c3-474e-8ae6-b65c556563ed req-332a3e89-0625-4437-8dde-3e7adc7685c2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:26:36 compute-0 nova_compute[189265]: 2025-09-30 07:26:36.665 2 DEBUG oslo_concurrency.lockutils [req-57edc9d6-20c3-474e-8ae6-b65c556563ed req-332a3e89-0625-4437-8dde-3e7adc7685c2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "de990aa4-7e40-416a-8e7b-ebf90847bb68-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:26:36 compute-0 nova_compute[189265]: 2025-09-30 07:26:36.665 2 DEBUG nova.compute.manager [req-57edc9d6-20c3-474e-8ae6-b65c556563ed req-332a3e89-0625-4437-8dde-3e7adc7685c2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] No waiting events found dispatching network-vif-unplugged-a937ddac-5b07-4fc5-8b58-8cc93dee8cac pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:26:36 compute-0 nova_compute[189265]: 2025-09-30 07:26:36.666 2 DEBUG nova.compute.manager [req-57edc9d6-20c3-474e-8ae6-b65c556563ed req-332a3e89-0625-4437-8dde-3e7adc7685c2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Received event network-vif-unplugged-a937ddac-5b07-4fc5-8b58-8cc93dee8cac for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:26:36 compute-0 nova_compute[189265]: 2025-09-30 07:26:36.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:26:37 compute-0 nova_compute[189265]: 2025-09-30 07:26:37.590 2 DEBUG nova.compute.manager [req-b8f23339-c7c1-4495-aee2-c4268faf41f5 req-3378d990-9d0a-455b-86a6-e1f0bd902845 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Received event network-vif-deleted-a937ddac-5b07-4fc5-8b58-8cc93dee8cac external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:26:37 compute-0 nova_compute[189265]: 2025-09-30 07:26:37.590 2 INFO nova.compute.manager [req-b8f23339-c7c1-4495-aee2-c4268faf41f5 req-3378d990-9d0a-455b-86a6-e1f0bd902845 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Neutron deleted interface a937ddac-5b07-4fc5-8b58-8cc93dee8cac; detaching it from the instance and deleting it from the info cache
Sep 30 07:26:37 compute-0 nova_compute[189265]: 2025-09-30 07:26:37.591 2 DEBUG nova.network.neutron [req-b8f23339-c7c1-4495-aee2-c4268faf41f5 req-3378d990-9d0a-455b-86a6-e1f0bd902845 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:26:37 compute-0 nova_compute[189265]: 2025-09-30 07:26:37.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:38 compute-0 nova_compute[189265]: 2025-09-30 07:26:38.043 2 DEBUG nova.network.neutron [-] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:26:38 compute-0 nova_compute[189265]: 2025-09-30 07:26:38.098 2 DEBUG nova.compute.manager [req-b8f23339-c7c1-4495-aee2-c4268faf41f5 req-3378d990-9d0a-455b-86a6-e1f0bd902845 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Detach interface failed, port_id=a937ddac-5b07-4fc5-8b58-8cc93dee8cac, reason: Instance de990aa4-7e40-416a-8e7b-ebf90847bb68 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 07:26:38 compute-0 nova_compute[189265]: 2025-09-30 07:26:38.551 2 INFO nova.compute.manager [-] [instance: de990aa4-7e40-416a-8e7b-ebf90847bb68] Took 2.93 seconds to deallocate network for instance.
Sep 30 07:26:38 compute-0 nova_compute[189265]: 2025-09-30 07:26:38.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:26:39 compute-0 nova_compute[189265]: 2025-09-30 07:26:39.095 2 DEBUG oslo_concurrency.lockutils [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:26:39 compute-0 nova_compute[189265]: 2025-09-30 07:26:39.095 2 DEBUG oslo_concurrency.lockutils [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:26:39 compute-0 nova_compute[189265]: 2025-09-30 07:26:39.163 2 DEBUG nova.compute.provider_tree [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:26:39 compute-0 nova_compute[189265]: 2025-09-30 07:26:39.677 2 DEBUG nova.scheduler.client.report [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:26:40 compute-0 nova_compute[189265]: 2025-09-30 07:26:40.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:40 compute-0 nova_compute[189265]: 2025-09-30 07:26:40.214 2 DEBUG oslo_concurrency.lockutils [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.119s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:26:40 compute-0 nova_compute[189265]: 2025-09-30 07:26:40.244 2 INFO nova.scheduler.client.report [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Deleted allocations for instance de990aa4-7e40-416a-8e7b-ebf90847bb68
Sep 30 07:26:40 compute-0 nova_compute[189265]: 2025-09-30 07:26:40.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:26:41 compute-0 nova_compute[189265]: 2025-09-30 07:26:41.270 2 DEBUG oslo_concurrency.lockutils [None req-10c89dce-1098-4260-8753-e405620065b2 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "de990aa4-7e40-416a-8e7b-ebf90847bb68" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.513s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:26:41 compute-0 nova_compute[189265]: 2025-09-30 07:26:41.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:26:41 compute-0 nova_compute[189265]: 2025-09-30 07:26:41.787 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.044 2 DEBUG oslo_concurrency.lockutils [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "ba50db09-103a-463d-9b29-917488cc4974" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.044 2 DEBUG oslo_concurrency.lockutils [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "ba50db09-103a-463d-9b29-917488cc4974" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.045 2 DEBUG oslo_concurrency.lockutils [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "ba50db09-103a-463d-9b29-917488cc4974-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.045 2 DEBUG oslo_concurrency.lockutils [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "ba50db09-103a-463d-9b29-917488cc4974-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.046 2 DEBUG oslo_concurrency.lockutils [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "ba50db09-103a-463d-9b29-917488cc4974-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.070 2 INFO nova.compute.manager [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Terminating instance
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.595 2 DEBUG nova.compute.manager [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:26:42 compute-0 kernel: tap51b727ab-5f (unregistering): left promiscuous mode
Sep 30 07:26:42 compute-0 NetworkManager[51813]: <info>  [1759217202.6189] device (tap51b727ab-5f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:26:42 compute-0 ovn_controller[91436]: 2025-09-30T07:26:42Z|00147|binding|INFO|Releasing lport 51b727ab-5fda-4e00-aef4-af6f5d5601f9 from this chassis (sb_readonly=0)
Sep 30 07:26:42 compute-0 ovn_controller[91436]: 2025-09-30T07:26:42Z|00148|binding|INFO|Setting lport 51b727ab-5fda-4e00-aef4-af6f5d5601f9 down in Southbound
Sep 30 07:26:42 compute-0 ovn_controller[91436]: 2025-09-30T07:26:42Z|00149|binding|INFO|Removing iface tap51b727ab-5f ovn-installed in OVS
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.631 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:42.633 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:a9:8c 10.100.0.6'], port_security=['fa:16:3e:1e:a9:8c 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'ba50db09-103a-463d-9b29-917488cc4974', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2ad7bd988b6047509c2c19eb4e0dc32c', 'neutron:revision_number': '15', 'neutron:security_group_ids': 'dc82e88d-abda-4feb-bd34-afbed64798c7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea21a402-508c-472e-bd89-e4a2e8cde5bb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=51b727ab-5fda-4e00-aef4-af6f5d5601f9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:26:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:42.633 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 51b727ab-5fda-4e00-aef4-af6f5d5601f9 in datapath 0a07ba3d-468f-4279-9be2-b3ef141df6a7 unbound from our chassis
Sep 30 07:26:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:42.635 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0a07ba3d-468f-4279-9be2-b3ef141df6a7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:26:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:42.636 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[968dc312-0a9c-4e5f-9a06-02844159b7c8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:42.637 100322 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7 namespace which is not needed anymore
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:42 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Sep 30 07:26:42 compute-0 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000000c.scope: Consumed 2.515s CPU time.
Sep 30 07:26:42 compute-0 systemd-machined[149233]: Machine qemu-10-instance-0000000c terminated.
Sep 30 07:26:42 compute-0 neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7[217058]: [NOTICE]   (217062) : haproxy version is 3.0.5-8e879a5
Sep 30 07:26:42 compute-0 neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7[217058]: [NOTICE]   (217062) : path to executable is /usr/sbin/haproxy
Sep 30 07:26:42 compute-0 neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7[217058]: [WARNING]  (217062) : Exiting Master process...
Sep 30 07:26:42 compute-0 podman[217480]: 2025-09-30 07:26:42.772296885 +0000 UTC m=+0.038458884 container kill 02fa1edbadd5e66dcae84b82abf37acd38b948cd9f7ec1bf91b6e568dfdd9621 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 07:26:42 compute-0 neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7[217058]: [ALERT]    (217062) : Current worker (217064) exited with code 143 (Terminated)
Sep 30 07:26:42 compute-0 neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7[217058]: [WARNING]  (217062) : All workers exited. Exiting... (0)
Sep 30 07:26:42 compute-0 systemd[1]: libpod-02fa1edbadd5e66dcae84b82abf37acd38b948cd9f7ec1bf91b6e568dfdd9621.scope: Deactivated successfully.
Sep 30 07:26:42 compute-0 podman[217495]: 2025-09-30 07:26:42.833751912 +0000 UTC m=+0.036950341 container died 02fa1edbadd5e66dcae84b82abf37acd38b948cd9f7ec1bf91b6e568dfdd9621 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.842 2 DEBUG nova.compute.manager [req-45fa8650-9041-48c4-884a-41b501c0718a req-263456fe-eda9-43f9-ab2b-c3d9fd34d073 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Received event network-vif-unplugged-51b727ab-5fda-4e00-aef4-af6f5d5601f9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.844 2 DEBUG oslo_concurrency.lockutils [req-45fa8650-9041-48c4-884a-41b501c0718a req-263456fe-eda9-43f9-ab2b-c3d9fd34d073 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "ba50db09-103a-463d-9b29-917488cc4974-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.844 2 DEBUG oslo_concurrency.lockutils [req-45fa8650-9041-48c4-884a-41b501c0718a req-263456fe-eda9-43f9-ab2b-c3d9fd34d073 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "ba50db09-103a-463d-9b29-917488cc4974-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.845 2 DEBUG oslo_concurrency.lockutils [req-45fa8650-9041-48c4-884a-41b501c0718a req-263456fe-eda9-43f9-ab2b-c3d9fd34d073 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "ba50db09-103a-463d-9b29-917488cc4974-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.845 2 DEBUG nova.compute.manager [req-45fa8650-9041-48c4-884a-41b501c0718a req-263456fe-eda9-43f9-ab2b-c3d9fd34d073 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] No waiting events found dispatching network-vif-unplugged-51b727ab-5fda-4e00-aef4-af6f5d5601f9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.845 2 DEBUG nova.compute.manager [req-45fa8650-9041-48c4-884a-41b501c0718a req-263456fe-eda9-43f9-ab2b-c3d9fd34d073 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Received event network-vif-unplugged-51b727ab-5fda-4e00-aef4-af6f5d5601f9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.857 2 INFO nova.virt.libvirt.driver [-] [instance: ba50db09-103a-463d-9b29-917488cc4974] Instance destroyed successfully.
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.858 2 DEBUG nova.objects.instance [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lazy-loading 'resources' on Instance uuid ba50db09-103a-463d-9b29-917488cc4974 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:26:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-02fa1edbadd5e66dcae84b82abf37acd38b948cd9f7ec1bf91b6e568dfdd9621-userdata-shm.mount: Deactivated successfully.
Sep 30 07:26:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-a4b884bf6ce7bff06ed9e8e4a936a76b49c90a435c925d242313a21f97f3dc53-merged.mount: Deactivated successfully.
Sep 30 07:26:42 compute-0 podman[217495]: 2025-09-30 07:26:42.883121035 +0000 UTC m=+0.086319464 container cleanup 02fa1edbadd5e66dcae84b82abf37acd38b948cd9f7ec1bf91b6e568dfdd9621 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 07:26:42 compute-0 systemd[1]: libpod-conmon-02fa1edbadd5e66dcae84b82abf37acd38b948cd9f7ec1bf91b6e568dfdd9621.scope: Deactivated successfully.
Sep 30 07:26:42 compute-0 podman[217497]: 2025-09-30 07:26:42.899544332 +0000 UTC m=+0.096174505 container remove 02fa1edbadd5e66dcae84b82abf37acd38b948cd9f7ec1bf91b6e568dfdd9621 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 07:26:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:42.908 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[8b65a2ce-242b-4b3c-86d5-d0c9cf3fec49]: (4, ("Tue Sep 30 07:26:42 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7 (02fa1edbadd5e66dcae84b82abf37acd38b948cd9f7ec1bf91b6e568dfdd9621)\n02fa1edbadd5e66dcae84b82abf37acd38b948cd9f7ec1bf91b6e568dfdd9621\nTue Sep 30 07:26:42 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7 (02fa1edbadd5e66dcae84b82abf37acd38b948cd9f7ec1bf91b6e568dfdd9621)\n02fa1edbadd5e66dcae84b82abf37acd38b948cd9f7ec1bf91b6e568dfdd9621\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:42.909 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7158ff93-7e82-44ce-8487-35eba0766cf1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:42.909 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0a07ba3d-468f-4279-9be2-b3ef141df6a7.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:26:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:42.909 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f6467601-bf73-4715-a2d2-af381b6f756f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:42.910 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a07ba3d-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:42 compute-0 kernel: tap0a07ba3d-40: left promiscuous mode
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:42.972 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7a6fc75e-bd26-4be8-a619-35daf5953e6a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:42 compute-0 nova_compute[189265]: 2025-09-30 07:26:42.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:43.013 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f7fd43a5-3494-4b68-86d1-aa9b139b54b2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:43.014 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b329e5c8-a14c-488d-89fd-6fc5f271e300]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:43.027 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[5808a43d-f45b-42e2-aadb-a8d9f69f2a93]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 500070, 'reachable_time': 21228, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217543, 'error': None, 'target': 'ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:43 compute-0 systemd[1]: run-netns-ovnmeta\x2d0a07ba3d\x2d468f\x2d4279\x2d9be2\x2db3ef141df6a7.mount: Deactivated successfully.
Sep 30 07:26:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:43.031 100440 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0a07ba3d-468f-4279-9be2-b3ef141df6a7 deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 07:26:43 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:26:43.031 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[6a7cf324-6edb-4936-b56f-23160c345f26]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.367 2 DEBUG nova.virt.libvirt.vif [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T07:25:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteHostMaintenanceStrategy-server-1193476648',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutehostmaintenancestrategy-server-1193476648',id=12,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:25:22Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2ad7bd988b6047509c2c19eb4e0dc32c',ramdisk_id='',reservation_id='r-7f0wnzmz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteHostMaintenanceStrategy-385408215',owner_user_name='tempest-TestExecuteHostMaintenanceStrategy-385408215-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:26:27Z,user_data=None,user_id='071bf5838f2f473a865873b6f7846f84',uuid=ba50db09-103a-463d-9b29-917488cc4974,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "51b727ab-5fda-4e00-aef4-af6f5d5601f9", "address": "fa:16:3e:1e:a9:8c", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51b727ab-5f", "ovs_interfaceid": "51b727ab-5fda-4e00-aef4-af6f5d5601f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.368 2 DEBUG nova.network.os_vif_util [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Converting VIF {"id": "51b727ab-5fda-4e00-aef4-af6f5d5601f9", "address": "fa:16:3e:1e:a9:8c", "network": {"id": "0a07ba3d-468f-4279-9be2-b3ef141df6a7", "bridge": "br-int", "label": "tempest-TestExecuteHostMaintenanceStrategy-465825729-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "02a4831cb362481d98b354ed3bf2d113", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap51b727ab-5f", "ovs_interfaceid": "51b727ab-5fda-4e00-aef4-af6f5d5601f9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.368 2 DEBUG nova.network.os_vif_util [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a9:8c,bridge_name='br-int',has_traffic_filtering=True,id=51b727ab-5fda-4e00-aef4-af6f5d5601f9,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51b727ab-5f') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.368 2 DEBUG os_vif [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a9:8c,bridge_name='br-int',has_traffic_filtering=True,id=51b727ab-5fda-4e00-aef4-af6f5d5601f9,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51b727ab-5f') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.370 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51b727ab-5f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.374 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b7a501ee-ca91-4195-b53c-ea7c04b42d43) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.377 2 INFO os_vif [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1e:a9:8c,bridge_name='br-int',has_traffic_filtering=True,id=51b727ab-5fda-4e00-aef4-af6f5d5601f9,network=Network(0a07ba3d-468f-4279-9be2-b3ef141df6a7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap51b727ab-5f')
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.377 2 INFO nova.virt.libvirt.driver [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Deleting instance files /var/lib/nova/instances/ba50db09-103a-463d-9b29-917488cc4974_del
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.378 2 INFO nova.virt.libvirt.driver [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Deletion of /var/lib/nova/instances/ba50db09-103a-463d-9b29-917488cc4974_del complete
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.889 2 INFO nova.compute.manager [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Took 1.29 seconds to destroy the instance on the hypervisor.
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.890 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.890 2 DEBUG nova.compute.manager [-] [instance: ba50db09-103a-463d-9b29-917488cc4974] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.890 2 DEBUG nova.network.neutron [-] [instance: ba50db09-103a-463d-9b29-917488cc4974] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:26:43 compute-0 nova_compute[189265]: 2025-09-30 07:26:43.890 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:44 compute-0 nova_compute[189265]: 2025-09-30 07:26:44.431 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:26:44 compute-0 nova_compute[189265]: 2025-09-30 07:26:44.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:26:44 compute-0 nova_compute[189265]: 2025-09-30 07:26:44.906 2 DEBUG nova.compute.manager [req-312dde78-fb0c-4e0c-b574-511abd4f19b7 req-49be14f1-50a5-4521-bb30-692119f577ac 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Received event network-vif-unplugged-51b727ab-5fda-4e00-aef4-af6f5d5601f9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:26:44 compute-0 nova_compute[189265]: 2025-09-30 07:26:44.906 2 DEBUG oslo_concurrency.lockutils [req-312dde78-fb0c-4e0c-b574-511abd4f19b7 req-49be14f1-50a5-4521-bb30-692119f577ac 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "ba50db09-103a-463d-9b29-917488cc4974-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:26:44 compute-0 nova_compute[189265]: 2025-09-30 07:26:44.906 2 DEBUG oslo_concurrency.lockutils [req-312dde78-fb0c-4e0c-b574-511abd4f19b7 req-49be14f1-50a5-4521-bb30-692119f577ac 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "ba50db09-103a-463d-9b29-917488cc4974-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:26:44 compute-0 nova_compute[189265]: 2025-09-30 07:26:44.906 2 DEBUG oslo_concurrency.lockutils [req-312dde78-fb0c-4e0c-b574-511abd4f19b7 req-49be14f1-50a5-4521-bb30-692119f577ac 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "ba50db09-103a-463d-9b29-917488cc4974-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:26:44 compute-0 nova_compute[189265]: 2025-09-30 07:26:44.906 2 DEBUG nova.compute.manager [req-312dde78-fb0c-4e0c-b574-511abd4f19b7 req-49be14f1-50a5-4521-bb30-692119f577ac 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] No waiting events found dispatching network-vif-unplugged-51b727ab-5fda-4e00-aef4-af6f5d5601f9 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:26:44 compute-0 nova_compute[189265]: 2025-09-30 07:26:44.907 2 DEBUG nova.compute.manager [req-312dde78-fb0c-4e0c-b574-511abd4f19b7 req-49be14f1-50a5-4521-bb30-692119f577ac 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Received event network-vif-unplugged-51b727ab-5fda-4e00-aef4-af6f5d5601f9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:26:44 compute-0 nova_compute[189265]: 2025-09-30 07:26:44.907 2 DEBUG nova.compute.manager [req-312dde78-fb0c-4e0c-b574-511abd4f19b7 req-49be14f1-50a5-4521-bb30-692119f577ac 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Received event network-vif-deleted-51b727ab-5fda-4e00-aef4-af6f5d5601f9 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:26:44 compute-0 nova_compute[189265]: 2025-09-30 07:26:44.907 2 INFO nova.compute.manager [req-312dde78-fb0c-4e0c-b574-511abd4f19b7 req-49be14f1-50a5-4521-bb30-692119f577ac 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Neutron deleted interface 51b727ab-5fda-4e00-aef4-af6f5d5601f9; detaching it from the instance and deleting it from the info cache
Sep 30 07:26:44 compute-0 nova_compute[189265]: 2025-09-30 07:26:44.907 2 DEBUG nova.network.neutron [req-312dde78-fb0c-4e0c-b574-511abd4f19b7 req-49be14f1-50a5-4521-bb30-692119f577ac 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:26:45 compute-0 nova_compute[189265]: 2025-09-30 07:26:45.253 2 DEBUG nova.network.neutron [-] [instance: ba50db09-103a-463d-9b29-917488cc4974] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:26:45 compute-0 nova_compute[189265]: 2025-09-30 07:26:45.297 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:26:45 compute-0 nova_compute[189265]: 2025-09-30 07:26:45.298 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:26:45 compute-0 nova_compute[189265]: 2025-09-30 07:26:45.298 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:26:45 compute-0 nova_compute[189265]: 2025-09-30 07:26:45.298 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:26:45 compute-0 nova_compute[189265]: 2025-09-30 07:26:45.413 2 DEBUG nova.compute.manager [req-312dde78-fb0c-4e0c-b574-511abd4f19b7 req-49be14f1-50a5-4521-bb30-692119f577ac 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: ba50db09-103a-463d-9b29-917488cc4974] Detach interface failed, port_id=51b727ab-5fda-4e00-aef4-af6f5d5601f9, reason: Instance ba50db09-103a-463d-9b29-917488cc4974 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 07:26:45 compute-0 nova_compute[189265]: 2025-09-30 07:26:45.441 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:26:45 compute-0 nova_compute[189265]: 2025-09-30 07:26:45.442 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:26:45 compute-0 nova_compute[189265]: 2025-09-30 07:26:45.478 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:26:45 compute-0 nova_compute[189265]: 2025-09-30 07:26:45.479 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5855MB free_disk=73.3039665222168GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:26:45 compute-0 nova_compute[189265]: 2025-09-30 07:26:45.479 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:26:45 compute-0 nova_compute[189265]: 2025-09-30 07:26:45.479 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:26:45 compute-0 nova_compute[189265]: 2025-09-30 07:26:45.759 2 INFO nova.compute.manager [-] [instance: ba50db09-103a-463d-9b29-917488cc4974] Took 1.87 seconds to deallocate network for instance.
Sep 30 07:26:46 compute-0 nova_compute[189265]: 2025-09-30 07:26:46.342 2 DEBUG oslo_concurrency.lockutils [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:26:47 compute-0 nova_compute[189265]: 2025-09-30 07:26:47.042 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance ba50db09-103a-463d-9b29-917488cc4974 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:26:47 compute-0 nova_compute[189265]: 2025-09-30 07:26:47.043 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:26:47 compute-0 nova_compute[189265]: 2025-09-30 07:26:47.043 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:26:45 up  1:24,  0 user,  load average: 0.21, 0.22, 0.34\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_2ad7bd988b6047509c2c19eb4e0dc32c': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:26:47 compute-0 nova_compute[189265]: 2025-09-30 07:26:47.082 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:26:47 compute-0 nova_compute[189265]: 2025-09-30 07:26:47.589 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:26:47 compute-0 nova_compute[189265]: 2025-09-30 07:26:47.992 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:48 compute-0 nova_compute[189265]: 2025-09-30 07:26:48.097 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:26:48 compute-0 nova_compute[189265]: 2025-09-30 07:26:48.097 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.618s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:26:48 compute-0 nova_compute[189265]: 2025-09-30 07:26:48.097 2 DEBUG oslo_concurrency.lockutils [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 1.756s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:26:48 compute-0 nova_compute[189265]: 2025-09-30 07:26:48.150 2 DEBUG nova.compute.provider_tree [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:26:48 compute-0 nova_compute[189265]: 2025-09-30 07:26:48.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:48 compute-0 nova_compute[189265]: 2025-09-30 07:26:48.661 2 DEBUG nova.scheduler.client.report [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:26:49 compute-0 nova_compute[189265]: 2025-09-30 07:26:49.098 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:26:49 compute-0 nova_compute[189265]: 2025-09-30 07:26:49.098 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:26:49 compute-0 nova_compute[189265]: 2025-09-30 07:26:49.174 2 DEBUG oslo_concurrency.lockutils [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.076s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:26:49 compute-0 nova_compute[189265]: 2025-09-30 07:26:49.209 2 INFO nova.scheduler.client.report [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Deleted allocations for instance ba50db09-103a-463d-9b29-917488cc4974
Sep 30 07:26:50 compute-0 nova_compute[189265]: 2025-09-30 07:26:50.248 2 DEBUG oslo_concurrency.lockutils [None req-7c7f3e2e-121a-41f5-b8a6-9fae09ea2d54 071bf5838f2f473a865873b6f7846f84 2ad7bd988b6047509c2c19eb4e0dc32c - - default default] Lock "ba50db09-103a-463d-9b29-917488cc4974" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.204s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:26:51 compute-0 podman[217547]: 2025-09-30 07:26:51.494651641 +0000 UTC m=+0.081027534 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:26:52 compute-0 nova_compute[189265]: 2025-09-30 07:26:52.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:26:52 compute-0 nova_compute[189265]: 2025-09-30 07:26:52.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:53 compute-0 nova_compute[189265]: 2025-09-30 07:26:53.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:57 compute-0 nova_compute[189265]: 2025-09-30 07:26:57.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:58 compute-0 nova_compute[189265]: 2025-09-30 07:26:58.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:26:59 compute-0 podman[199733]: time="2025-09-30T07:26:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:26:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:26:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:26:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:26:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3013 "" "Go-http-client/1.1"
Sep 30 07:27:00 compute-0 podman[217571]: 2025-09-30 07:27:00.481538977 +0000 UTC m=+0.068280332 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=iscsid, io.buildah.version=1.41.4, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:27:01 compute-0 openstack_network_exporter[201859]: ERROR   07:27:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:27:01 compute-0 openstack_network_exporter[201859]: ERROR   07:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:27:01 compute-0 openstack_network_exporter[201859]: ERROR   07:27:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:27:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:27:01 compute-0 openstack_network_exporter[201859]: ERROR   07:27:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:27:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:27:01 compute-0 openstack_network_exporter[201859]: ERROR   07:27:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:27:02 compute-0 nova_compute[189265]: 2025-09-30 07:27:02.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:03 compute-0 nova_compute[189265]: 2025-09-30 07:27:03.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:03 compute-0 podman[217591]: 2025-09-30 07:27:03.501762511 +0000 UTC m=+0.075344492 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 07:27:07 compute-0 podman[217613]: 2025-09-30 07:27:07.487153752 +0000 UTC m=+0.069500207 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4)
Sep 30 07:27:07 compute-0 podman[217614]: 2025-09-30 07:27:07.501104429 +0000 UTC m=+0.068286062 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 07:27:07 compute-0 podman[217615]: 2025-09-30 07:27:07.525684817 +0000 UTC m=+0.098743927 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Sep 30 07:27:08 compute-0 nova_compute[189265]: 2025-09-30 07:27:08.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:08 compute-0 nova_compute[189265]: 2025-09-30 07:27:08.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:09 compute-0 nova_compute[189265]: 2025-09-30 07:27:09.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:11.388 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:27:11 compute-0 nova_compute[189265]: 2025-09-30 07:27:11.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:11.388 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:27:13 compute-0 nova_compute[189265]: 2025-09-30 07:27:13.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:13 compute-0 nova_compute[189265]: 2025-09-30 07:27:13.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:18 compute-0 nova_compute[189265]: 2025-09-30 07:27:18.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:18 compute-0 nova_compute[189265]: 2025-09-30 07:27:18.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:20.083 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:a6:6e 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7941131a-79be-4766-9949-6940a349838b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7941131a-79be-4766-9949-6940a349838b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7a0fa2d7b9154a3e805d3e0bc55dba15', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ba7b575f-de57-4775-8893-23f8afd87628, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=5e04d38b-e10a-4d94-bb17-19361a26ba02) old=Port_Binding(mac=['fa:16:3e:46:a6:6e'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-7941131a-79be-4766-9949-6940a349838b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7941131a-79be-4766-9949-6940a349838b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7a0fa2d7b9154a3e805d3e0bc55dba15', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:27:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:20.084 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 5e04d38b-e10a-4d94-bb17-19361a26ba02 in datapath 7941131a-79be-4766-9949-6940a349838b updated
Sep 30 07:27:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:20.085 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7941131a-79be-4766-9949-6940a349838b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:27:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:20.086 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c3a8a2bd-c448-46f3-9a2e-730978868273]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:27:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:20.556 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:27:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:20.557 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:27:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:20.557 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:27:21 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:21.392 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:27:22 compute-0 podman[217677]: 2025-09-30 07:27:22.497023652 +0000 UTC m=+0.083980208 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:27:23 compute-0 nova_compute[189265]: 2025-09-30 07:27:23.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:23 compute-0 nova_compute[189265]: 2025-09-30 07:27:23.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:28 compute-0 nova_compute[189265]: 2025-09-30 07:27:28.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:28 compute-0 nova_compute[189265]: 2025-09-30 07:27:28.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:28.634 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:8e:91 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-2ab91f49-0f5b-465f-b8ad-96a8170dbec9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ab91f49-0f5b-465f-b8ad-96a8170dbec9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0901fa050bd54a72b891ab273cd6c37d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4dbd857e-990c-4a0d-9b16-b1bbeca10d0f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=20b77299-f0f1-423c-b518-26a37fff2be6) old=Port_Binding(mac=['fa:16:3e:40:8e:91'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-2ab91f49-0f5b-465f-b8ad-96a8170dbec9', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-2ab91f49-0f5b-465f-b8ad-96a8170dbec9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0901fa050bd54a72b891ab273cd6c37d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:27:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:28.636 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 20b77299-f0f1-423c-b518-26a37fff2be6 in datapath 2ab91f49-0f5b-465f-b8ad-96a8170dbec9 updated
Sep 30 07:27:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:28.637 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 2ab91f49-0f5b-465f-b8ad-96a8170dbec9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:27:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:28.638 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[e2fc9229-5c02-4581-aefd-8933b72bc72b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:27:29 compute-0 podman[199733]: time="2025-09-30T07:27:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:27:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:27:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:27:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:27:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3008 "" "Go-http-client/1.1"
Sep 30 07:27:31 compute-0 openstack_network_exporter[201859]: ERROR   07:27:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:27:31 compute-0 openstack_network_exporter[201859]: ERROR   07:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:27:31 compute-0 openstack_network_exporter[201859]: ERROR   07:27:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:27:31 compute-0 openstack_network_exporter[201859]: ERROR   07:27:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:27:31 compute-0 openstack_network_exporter[201859]: ERROR   07:27:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:27:31 compute-0 podman[217701]: 2025-09-30 07:27:31.508120195 +0000 UTC m=+0.082379833 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:27:33 compute-0 nova_compute[189265]: 2025-09-30 07:27:33.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:33 compute-0 nova_compute[189265]: 2025-09-30 07:27:33.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:34 compute-0 podman[217721]: 2025-09-30 07:27:34.464137075 +0000 UTC m=+0.056529018 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, vendor=Red Hat, Inc., release=1755695350)
Sep 30 07:27:37 compute-0 nova_compute[189265]: 2025-09-30 07:27:37.784 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:27:38 compute-0 nova_compute[189265]: 2025-09-30 07:27:38.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:38 compute-0 nova_compute[189265]: 2025-09-30 07:27:38.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:38 compute-0 podman[217743]: 2025-09-30 07:27:38.51016263 +0000 UTC m=+0.087213070 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:27:38 compute-0 podman[217744]: 2025-09-30 07:27:38.519205777 +0000 UTC m=+0.088511517 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4)
Sep 30 07:27:38 compute-0 podman[217745]: 2025-09-30 07:27:38.568438967 +0000 UTC m=+0.135689848 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Sep 30 07:27:40 compute-0 nova_compute[189265]: 2025-09-30 07:27:40.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:27:41 compute-0 ovn_controller[91436]: 2025-09-30T07:27:41Z|00150|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 07:27:41 compute-0 nova_compute[189265]: 2025-09-30 07:27:41.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:27:41 compute-0 nova_compute[189265]: 2025-09-30 07:27:41.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:27:41 compute-0 nova_compute[189265]: 2025-09-30 07:27:41.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:27:43 compute-0 nova_compute[189265]: 2025-09-30 07:27:43.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:43 compute-0 nova_compute[189265]: 2025-09-30 07:27:43.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:46 compute-0 nova_compute[189265]: 2025-09-30 07:27:46.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:27:47 compute-0 nova_compute[189265]: 2025-09-30 07:27:47.294 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:27:47 compute-0 nova_compute[189265]: 2025-09-30 07:27:47.811 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:27:47 compute-0 nova_compute[189265]: 2025-09-30 07:27:47.812 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:27:47 compute-0 nova_compute[189265]: 2025-09-30 07:27:47.812 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:27:47 compute-0 nova_compute[189265]: 2025-09-30 07:27:47.812 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:27:47 compute-0 nova_compute[189265]: 2025-09-30 07:27:47.962 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:27:47 compute-0 nova_compute[189265]: 2025-09-30 07:27:47.964 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:27:47 compute-0 nova_compute[189265]: 2025-09-30 07:27:47.997 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:27:47 compute-0 nova_compute[189265]: 2025-09-30 07:27:47.998 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5860MB free_disk=73.30398559570312GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:27:47 compute-0 nova_compute[189265]: 2025-09-30 07:27:47.998 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:27:47 compute-0 nova_compute[189265]: 2025-09-30 07:27:47.998 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:27:48 compute-0 nova_compute[189265]: 2025-09-30 07:27:48.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:48 compute-0 nova_compute[189265]: 2025-09-30 07:27:48.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:49 compute-0 nova_compute[189265]: 2025-09-30 07:27:49.058 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:27:49 compute-0 nova_compute[189265]: 2025-09-30 07:27:49.059 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:27:47 up  1:25,  0 user,  load average: 0.12, 0.19, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:27:49 compute-0 nova_compute[189265]: 2025-09-30 07:27:49.086 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:27:49 compute-0 nova_compute[189265]: 2025-09-30 07:27:49.596 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:27:50 compute-0 nova_compute[189265]: 2025-09-30 07:27:50.107 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:27:50 compute-0 nova_compute[189265]: 2025-09-30 07:27:50.108 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.110s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:27:50 compute-0 nova_compute[189265]: 2025-09-30 07:27:50.601 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:27:50 compute-0 nova_compute[189265]: 2025-09-30 07:27:50.602 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:27:52 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:52.625 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:67:8c 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61ab665f922649eba82c57a34e0b452b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=67b7df48-3f38-444a-8506-1c0ec5bd1d15) old=Port_Binding(mac=['fa:16:3e:09:67:8c'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61ab665f922649eba82c57a34e0b452b', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:27:52 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:52.626 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 67b7df48-3f38-444a-8506-1c0ec5bd1d15 in datapath c99c822b-3191-49e5-b938-903e25b4a9bb updated
Sep 30 07:27:52 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:52.627 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c99c822b-3191-49e5-b938-903e25b4a9bb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:27:52 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:27:52.628 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f845b33d-8afe-40a6-b55d-93a49d644e94]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:27:53 compute-0 nova_compute[189265]: 2025-09-30 07:27:53.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:53 compute-0 nova_compute[189265]: 2025-09-30 07:27:53.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:53 compute-0 podman[217810]: 2025-09-30 07:27:53.489289876 +0000 UTC m=+0.066561233 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:27:53 compute-0 nova_compute[189265]: 2025-09-30 07:27:53.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:27:58 compute-0 nova_compute[189265]: 2025-09-30 07:27:58.149 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:58 compute-0 nova_compute[189265]: 2025-09-30 07:27:58.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:27:59 compute-0 podman[199733]: time="2025-09-30T07:27:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:27:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:27:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:27:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:27:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3006 "" "Go-http-client/1.1"
Sep 30 07:28:01 compute-0 openstack_network_exporter[201859]: ERROR   07:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:28:01 compute-0 openstack_network_exporter[201859]: ERROR   07:28:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:28:01 compute-0 openstack_network_exporter[201859]: ERROR   07:28:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:28:01 compute-0 openstack_network_exporter[201859]: ERROR   07:28:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:28:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:28:01 compute-0 openstack_network_exporter[201859]: ERROR   07:28:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:28:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:28:02 compute-0 podman[217835]: 2025-09-30 07:28:02.497145728 +0000 UTC m=+0.076616329 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, container_name=iscsid, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 07:28:03 compute-0 nova_compute[189265]: 2025-09-30 07:28:03.154 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:03 compute-0 nova_compute[189265]: 2025-09-30 07:28:03.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:03 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:03.681 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:76:a2:dc 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-b6f0a84f-50c7-4725-a0d2-3755bd5c86fb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b6f0a84f-50c7-4725-a0d2-3755bd5c86fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2561d954-bc4c-44d3-ada3-61673f0518ee, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d2bfbca2-e42d-4f46-b7cd-4400e7dfdcf9) old=Port_Binding(mac=['fa:16:3e:76:a2:dc'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-b6f0a84f-50c7-4725-a0d2-3755bd5c86fb', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b6f0a84f-50c7-4725-a0d2-3755bd5c86fb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:28:03 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:03.682 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d2bfbca2-e42d-4f46-b7cd-4400e7dfdcf9 in datapath b6f0a84f-50c7-4725-a0d2-3755bd5c86fb updated
Sep 30 07:28:03 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:03.684 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b6f0a84f-50c7-4725-a0d2-3755bd5c86fb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:28:03 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:03.685 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[fa4ad8ba-fe49-4cbe-9df6-8b452322287f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:05 compute-0 podman[217856]: 2025-09-30 07:28:05.505913679 +0000 UTC m=+0.087880450 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, io.openshift.tags=minimal rhel9, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible)
Sep 30 07:28:08 compute-0 nova_compute[189265]: 2025-09-30 07:28:08.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:08 compute-0 nova_compute[189265]: 2025-09-30 07:28:08.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:09 compute-0 podman[217879]: 2025-09-30 07:28:09.475213398 +0000 UTC m=+0.058886580 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_metadata_agent)
Sep 30 07:28:09 compute-0 podman[217878]: 2025-09-30 07:28:09.497323144 +0000 UTC m=+0.079712830 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 07:28:09 compute-0 podman[217880]: 2025-09-30 07:28:09.56388791 +0000 UTC m=+0.145655578 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 07:28:13 compute-0 nova_compute[189265]: 2025-09-30 07:28:13.188 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:13 compute-0 nova_compute[189265]: 2025-09-30 07:28:13.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:15.914 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:28:15 compute-0 nova_compute[189265]: 2025-09-30 07:28:15.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:15.914 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:28:16 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:16.915 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:28:16 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 07:28:18 compute-0 nova_compute[189265]: 2025-09-30 07:28:18.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:18 compute-0 nova_compute[189265]: 2025-09-30 07:28:18.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:20.558 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:28:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:20.558 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:28:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:20.558 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:28:22 compute-0 unix_chkpwd[217947]: password check failed for user (root)
Sep 30 07:28:22 compute-0 sshd-session[217945]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.108  user=root
Sep 30 07:28:23 compute-0 nova_compute[189265]: 2025-09-30 07:28:23.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:23 compute-0 nova_compute[189265]: 2025-09-30 07:28:23.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:24 compute-0 podman[217948]: 2025-09-30 07:28:24.456985323 +0000 UTC m=+0.044248095 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:28:24 compute-0 sshd-session[217945]: Failed password for root from 91.224.92.108 port 26818 ssh2
Sep 30 07:28:26 compute-0 unix_chkpwd[217972]: password check failed for user (root)
Sep 30 07:28:28 compute-0 nova_compute[189265]: 2025-09-30 07:28:28.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:28 compute-0 nova_compute[189265]: 2025-09-30 07:28:28.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:28 compute-0 sshd-session[217945]: Failed password for root from 91.224.92.108 port 26818 ssh2
Sep 30 07:28:28 compute-0 unix_chkpwd[217973]: password check failed for user (root)
Sep 30 07:28:29 compute-0 podman[199733]: time="2025-09-30T07:28:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:28:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:28:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:28:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:28:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3011 "" "Go-http-client/1.1"
Sep 30 07:28:31 compute-0 sshd-session[217945]: Failed password for root from 91.224.92.108 port 26818 ssh2
Sep 30 07:28:31 compute-0 openstack_network_exporter[201859]: ERROR   07:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:28:31 compute-0 openstack_network_exporter[201859]: ERROR   07:28:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:28:31 compute-0 openstack_network_exporter[201859]: ERROR   07:28:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:28:31 compute-0 openstack_network_exporter[201859]: ERROR   07:28:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:28:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:28:31 compute-0 openstack_network_exporter[201859]: ERROR   07:28:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:28:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:28:32 compute-0 sshd-session[217945]: Received disconnect from 91.224.92.108 port 26818:11:  [preauth]
Sep 30 07:28:32 compute-0 sshd-session[217945]: Disconnected from authenticating user root 91.224.92.108 port 26818 [preauth]
Sep 30 07:28:32 compute-0 sshd-session[217945]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.108  user=root
Sep 30 07:28:33 compute-0 nova_compute[189265]: 2025-09-30 07:28:33.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:33 compute-0 nova_compute[189265]: 2025-09-30 07:28:33.349 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "c461f91a-e2a7-4222-a940-d8ab09ea4807" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:28:33 compute-0 nova_compute[189265]: 2025-09-30 07:28:33.349 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:28:33 compute-0 nova_compute[189265]: 2025-09-30 07:28:33.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:33 compute-0 podman[217976]: 2025-09-30 07:28:33.503589818 +0000 UTC m=+0.077571919 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:28:33 compute-0 unix_chkpwd[217996]: password check failed for user (root)
Sep 30 07:28:33 compute-0 sshd-session[217974]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.108  user=root
Sep 30 07:28:33 compute-0 nova_compute[189265]: 2025-09-30 07:28:33.864 2 DEBUG nova.compute.manager [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 07:28:34 compute-0 nova_compute[189265]: 2025-09-30 07:28:34.428 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:28:34 compute-0 nova_compute[189265]: 2025-09-30 07:28:34.429 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:28:34 compute-0 nova_compute[189265]: 2025-09-30 07:28:34.438 2 DEBUG nova.virt.hardware [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 07:28:34 compute-0 nova_compute[189265]: 2025-09-30 07:28:34.438 2 INFO nova.compute.claims [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Claim successful on node compute-0.ctlplane.example.com
Sep 30 07:28:35 compute-0 nova_compute[189265]: 2025-09-30 07:28:35.499 2 DEBUG nova.compute.provider_tree [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:28:35 compute-0 sshd-session[217974]: Failed password for root from 91.224.92.108 port 32442 ssh2
Sep 30 07:28:35 compute-0 unix_chkpwd[217997]: password check failed for user (root)
Sep 30 07:28:36 compute-0 nova_compute[189265]: 2025-09-30 07:28:36.007 2 DEBUG nova.scheduler.client.report [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:28:36 compute-0 podman[217998]: 2025-09-30 07:28:36.506583508 +0000 UTC m=+0.086603575 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, io.buildah.version=1.33.7, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, build-date=2025-08-20T13:12:41, release=1755695350)
Sep 30 07:28:36 compute-0 nova_compute[189265]: 2025-09-30 07:28:36.519 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:28:36 compute-0 nova_compute[189265]: 2025-09-30 07:28:36.520 2 DEBUG nova.compute.manager [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 07:28:37 compute-0 nova_compute[189265]: 2025-09-30 07:28:37.033 2 DEBUG nova.compute.manager [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 07:28:37 compute-0 nova_compute[189265]: 2025-09-30 07:28:37.033 2 DEBUG nova.network.neutron [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 07:28:37 compute-0 nova_compute[189265]: 2025-09-30 07:28:37.034 2 WARNING neutronclient.v2_0.client [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:28:37 compute-0 nova_compute[189265]: 2025-09-30 07:28:37.034 2 WARNING neutronclient.v2_0.client [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:28:37 compute-0 nova_compute[189265]: 2025-09-30 07:28:37.542 2 INFO nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 07:28:37 compute-0 sshd-session[217974]: Failed password for root from 91.224.92.108 port 32442 ssh2
Sep 30 07:28:38 compute-0 nova_compute[189265]: 2025-09-30 07:28:38.051 2 DEBUG nova.compute.manager [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 07:28:38 compute-0 unix_chkpwd[218020]: password check failed for user (root)
Sep 30 07:28:38 compute-0 nova_compute[189265]: 2025-09-30 07:28:38.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:38 compute-0 nova_compute[189265]: 2025-09-30 07:28:38.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:38 compute-0 nova_compute[189265]: 2025-09-30 07:28:38.778 2 DEBUG nova.network.neutron [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Successfully created port: 434f2b55-d79b-4459-9ed7-924027ebd4e4 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.073 2 DEBUG nova.compute.manager [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.075 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.076 2 INFO nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Creating image(s)
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.076 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.077 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.078 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.079 2 DEBUG oslo_utils.imageutils.format_inspector [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.085 2 DEBUG oslo_utils.imageutils.format_inspector [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.089 2 DEBUG oslo_concurrency.processutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.175 2 DEBUG oslo_concurrency.processutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.176 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.177 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.177 2 DEBUG oslo_utils.imageutils.format_inspector [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.180 2 DEBUG oslo_utils.imageutils.format_inspector [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.181 2 DEBUG oslo_concurrency.processutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.251 2 DEBUG oslo_concurrency.processutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.252 2 DEBUG oslo_concurrency.processutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.292 2 DEBUG oslo_concurrency.processutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.293 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.294 2 DEBUG oslo_concurrency.processutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.363 2 DEBUG oslo_concurrency.processutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.364 2 DEBUG nova.virt.disk.api [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Checking if we can resize image /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.365 2 DEBUG oslo_concurrency.processutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.433 2 DEBUG oslo_concurrency.processutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.434 2 DEBUG nova.virt.disk.api [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Cannot resize image /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.435 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.435 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Ensure instance console log exists: /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.435 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.435 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.436 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:28:39 compute-0 nova_compute[189265]: 2025-09-30 07:28:39.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:28:39 compute-0 sshd-session[217974]: Failed password for root from 91.224.92.108 port 32442 ssh2
Sep 30 07:28:40 compute-0 sshd-session[217974]: Received disconnect from 91.224.92.108 port 32442:11:  [preauth]
Sep 30 07:28:40 compute-0 sshd-session[217974]: Disconnected from authenticating user root 91.224.92.108 port 32442 [preauth]
Sep 30 07:28:40 compute-0 sshd-session[217974]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.108  user=root
Sep 30 07:28:40 compute-0 podman[218039]: 2025-09-30 07:28:40.485235913 +0000 UTC m=+0.063542651 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 07:28:40 compute-0 podman[218036]: 2025-09-30 07:28:40.49393218 +0000 UTC m=+0.070926541 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 07:28:40 compute-0 podman[218040]: 2025-09-30 07:28:40.516941442 +0000 UTC m=+0.092615885 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:28:40 compute-0 nova_compute[189265]: 2025-09-30 07:28:40.537 2 DEBUG nova.network.neutron [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Successfully updated port: 434f2b55-d79b-4459-9ed7-924027ebd4e4 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 07:28:40 compute-0 nova_compute[189265]: 2025-09-30 07:28:40.593 2 DEBUG nova.compute.manager [req-06b5003a-5846-492c-9d6d-a9ee79f465eb req-da29bc54-90dc-4a32-ab75-17bac94ba7de 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Received event network-changed-434f2b55-d79b-4459-9ed7-924027ebd4e4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:28:40 compute-0 nova_compute[189265]: 2025-09-30 07:28:40.594 2 DEBUG nova.compute.manager [req-06b5003a-5846-492c-9d6d-a9ee79f465eb req-da29bc54-90dc-4a32-ab75-17bac94ba7de 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Refreshing instance network info cache due to event network-changed-434f2b55-d79b-4459-9ed7-924027ebd4e4. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 07:28:40 compute-0 nova_compute[189265]: 2025-09-30 07:28:40.594 2 DEBUG oslo_concurrency.lockutils [req-06b5003a-5846-492c-9d6d-a9ee79f465eb req-da29bc54-90dc-4a32-ab75-17bac94ba7de 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-c461f91a-e2a7-4222-a940-d8ab09ea4807" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:28:40 compute-0 nova_compute[189265]: 2025-09-30 07:28:40.594 2 DEBUG oslo_concurrency.lockutils [req-06b5003a-5846-492c-9d6d-a9ee79f465eb req-da29bc54-90dc-4a32-ab75-17bac94ba7de 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-c461f91a-e2a7-4222-a940-d8ab09ea4807" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:28:40 compute-0 nova_compute[189265]: 2025-09-30 07:28:40.594 2 DEBUG nova.network.neutron [req-06b5003a-5846-492c-9d6d-a9ee79f465eb req-da29bc54-90dc-4a32-ab75-17bac94ba7de 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Refreshing network info cache for port 434f2b55-d79b-4459-9ed7-924027ebd4e4 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 07:28:40 compute-0 nova_compute[189265]: 2025-09-30 07:28:40.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:28:41 compute-0 nova_compute[189265]: 2025-09-30 07:28:41.043 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "refresh_cache-c461f91a-e2a7-4222-a940-d8ab09ea4807" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:28:41 compute-0 nova_compute[189265]: 2025-09-30 07:28:41.101 2 WARNING neutronclient.v2_0.client [req-06b5003a-5846-492c-9d6d-a9ee79f465eb req-da29bc54-90dc-4a32-ab75-17bac94ba7de 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:28:41 compute-0 unix_chkpwd[218097]: password check failed for user (root)
Sep 30 07:28:41 compute-0 sshd-session[218037]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.108  user=root
Sep 30 07:28:41 compute-0 nova_compute[189265]: 2025-09-30 07:28:41.592 2 DEBUG nova.network.neutron [req-06b5003a-5846-492c-9d6d-a9ee79f465eb req-da29bc54-90dc-4a32-ab75-17bac94ba7de 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:28:41 compute-0 nova_compute[189265]: 2025-09-30 07:28:41.781 2 DEBUG nova.network.neutron [req-06b5003a-5846-492c-9d6d-a9ee79f465eb req-da29bc54-90dc-4a32-ab75-17bac94ba7de 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:28:42 compute-0 nova_compute[189265]: 2025-09-30 07:28:42.289 2 DEBUG oslo_concurrency.lockutils [req-06b5003a-5846-492c-9d6d-a9ee79f465eb req-da29bc54-90dc-4a32-ab75-17bac94ba7de 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-c461f91a-e2a7-4222-a940-d8ab09ea4807" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:28:42 compute-0 nova_compute[189265]: 2025-09-30 07:28:42.290 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquired lock "refresh_cache-c461f91a-e2a7-4222-a940-d8ab09ea4807" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:28:42 compute-0 nova_compute[189265]: 2025-09-30 07:28:42.290 2 DEBUG nova.network.neutron [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:28:42 compute-0 sshd-session[218037]: Failed password for root from 91.224.92.108 port 64620 ssh2
Sep 30 07:28:43 compute-0 nova_compute[189265]: 2025-09-30 07:28:43.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:43 compute-0 unix_chkpwd[218098]: password check failed for user (root)
Sep 30 07:28:43 compute-0 nova_compute[189265]: 2025-09-30 07:28:43.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:43 compute-0 nova_compute[189265]: 2025-09-30 07:28:43.581 2 DEBUG nova.network.neutron [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:28:43 compute-0 nova_compute[189265]: 2025-09-30 07:28:43.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:28:43 compute-0 nova_compute[189265]: 2025-09-30 07:28:43.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:28:43 compute-0 nova_compute[189265]: 2025-09-30 07:28:43.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:28:43 compute-0 nova_compute[189265]: 2025-09-30 07:28:43.833 2 WARNING neutronclient.v2_0.client [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.270 2 DEBUG nova.network.neutron [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Updating instance_info_cache with network_info: [{"id": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "address": "fa:16:3e:9a:b9:c8", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap434f2b55-d7", "ovs_interfaceid": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.780 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Releasing lock "refresh_cache-c461f91a-e2a7-4222-a940-d8ab09ea4807" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.781 2 DEBUG nova.compute.manager [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Instance network_info: |[{"id": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "address": "fa:16:3e:9a:b9:c8", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap434f2b55-d7", "ovs_interfaceid": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.785 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Start _get_guest_xml network_info=[{"id": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "address": "fa:16:3e:9a:b9:c8", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap434f2b55-d7", "ovs_interfaceid": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '0c6b92f5-9861-49e4-862d-3ffd84520dfa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.789 2 WARNING nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.791 2 DEBUG nova.virt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-1498059428', uuid='c461f91a-e2a7-4222-a940-d8ab09ea4807'), owner=OwnerMeta(userid='89ba5d19014145188ad2a3c812acdc88', username='tempest-TestExecuteStrategies-1096120513-project-admin', projectid='6431607f3dce4c88bbf6d17ee6cd45b2', projectname='tempest-TestExecuteStrategies-1096120513'), image=ImageMeta(id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "address": "fa:16:3e:9a:b9:c8", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap434f2b55-d7", "ovs_interfaceid": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759217324.7917476) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.799 2 DEBUG nova.virt.libvirt.host [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.800 2 DEBUG nova.virt.libvirt.host [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.808 2 DEBUG nova.virt.libvirt.host [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.808 2 DEBUG nova.virt.libvirt.host [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.809 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.809 2 DEBUG nova.virt.hardware [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T07:07:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.810 2 DEBUG nova.virt.hardware [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.811 2 DEBUG nova.virt.hardware [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.811 2 DEBUG nova.virt.hardware [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.812 2 DEBUG nova.virt.hardware [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.812 2 DEBUG nova.virt.hardware [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.812 2 DEBUG nova.virt.hardware [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.813 2 DEBUG nova.virt.hardware [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.813 2 DEBUG nova.virt.hardware [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.814 2 DEBUG nova.virt.hardware [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.814 2 DEBUG nova.virt.hardware [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.820 2 DEBUG nova.virt.libvirt.vif [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:28:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1498059428',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1498059428',id=15,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-1pekgb53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admi
n'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:28:38Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=c461f91a-e2a7-4222-a940-d8ab09ea4807,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "address": "fa:16:3e:9a:b9:c8", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap434f2b55-d7", "ovs_interfaceid": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.821 2 DEBUG nova.network.os_vif_util [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converting VIF {"id": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "address": "fa:16:3e:9a:b9:c8", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap434f2b55-d7", "ovs_interfaceid": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.822 2 DEBUG nova.network.os_vif_util [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:b9:c8,bridge_name='br-int',has_traffic_filtering=True,id=434f2b55-d79b-4459-9ed7-924027ebd4e4,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap434f2b55-d7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:28:44 compute-0 nova_compute[189265]: 2025-09-30 07:28:44.823 2 DEBUG nova.objects.instance [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lazy-loading 'pci_devices' on Instance uuid c461f91a-e2a7-4222-a940-d8ab09ea4807 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:28:45 compute-0 sshd-session[218037]: Failed password for root from 91.224.92.108 port 64620 ssh2
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.333 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] End _get_guest_xml xml=<domain type="kvm">
Sep 30 07:28:45 compute-0 nova_compute[189265]:   <uuid>c461f91a-e2a7-4222-a940-d8ab09ea4807</uuid>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   <name>instance-0000000f</name>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   <memory>131072</memory>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   <vcpu>1</vcpu>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteStrategies-server-1498059428</nova:name>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:28:44</nova:creationTime>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:28:45 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:28:45 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:28:45 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:28:45 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:28:45 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:28:45 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:28:45 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:28:45 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:28:45 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:28:45 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:28:45 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:28:45 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:28:45 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:28:45 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:28:45 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:28:45 compute-0 nova_compute[189265]:         <nova:user uuid="89ba5d19014145188ad2a3c812acdc88">tempest-TestExecuteStrategies-1096120513-project-admin</nova:user>
Sep 30 07:28:45 compute-0 nova_compute[189265]:         <nova:project uuid="6431607f3dce4c88bbf6d17ee6cd45b2">tempest-TestExecuteStrategies-1096120513</nova:project>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:28:45 compute-0 nova_compute[189265]:         <nova:port uuid="434f2b55-d79b-4459-9ed7-924027ebd4e4">
Sep 30 07:28:45 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <system>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <entry name="serial">c461f91a-e2a7-4222-a940-d8ab09ea4807</entry>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <entry name="uuid">c461f91a-e2a7-4222-a940-d8ab09ea4807</entry>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     </system>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   <os>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   </os>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   <features>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <vmcoreinfo/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   </features>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   <cpu mode="host-model" match="exact">
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk.config"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:9a:b9:c8"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <target dev="tap434f2b55-d7"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/console.log" append="off"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <video>
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     </video>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <controller type="usb" index="0"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:28:45 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:28:45 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:28:45 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:28:45 compute-0 nova_compute[189265]: </domain>
Sep 30 07:28:45 compute-0 nova_compute[189265]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.334 2 DEBUG nova.compute.manager [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Preparing to wait for external event network-vif-plugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.335 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.335 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.335 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.337 2 DEBUG nova.virt.libvirt.vif [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:28:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1498059428',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1498059428',id=15,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-1pekgb53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:28:38Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=c461f91a-e2a7-4222-a940-d8ab09ea4807,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "address": "fa:16:3e:9a:b9:c8", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap434f2b55-d7", "ovs_interfaceid": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.337 2 DEBUG nova.network.os_vif_util [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converting VIF {"id": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "address": "fa:16:3e:9a:b9:c8", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap434f2b55-d7", "ovs_interfaceid": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.338 2 DEBUG nova.network.os_vif_util [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:b9:c8,bridge_name='br-int',has_traffic_filtering=True,id=434f2b55-d79b-4459-9ed7-924027ebd4e4,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap434f2b55-d7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.338 2 DEBUG os_vif [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:b9:c8,bridge_name='br-int',has_traffic_filtering=True,id=434f2b55-d79b-4459-9ed7-924027ebd4e4,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap434f2b55-d7') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.340 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.340 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.341 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '08906ed8-9110-577d-87ae-2267212cb7df', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.350 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap434f2b55-d7, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.350 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap434f2b55-d7, col_values=(('qos', UUID('11fe5315-73cc-4236-95f2-f863dedfb66c')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.351 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap434f2b55-d7, col_values=(('external_ids', {'iface-id': '434f2b55-d79b-4459-9ed7-924027ebd4e4', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9a:b9:c8', 'vm-uuid': 'c461f91a-e2a7-4222-a940-d8ab09ea4807'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:45 compute-0 NetworkManager[51813]: <info>  [1759217325.3540] manager: (tap434f2b55-d7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:45 compute-0 nova_compute[189265]: 2025-09-30 07:28:45.360 2 INFO os_vif [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:b9:c8,bridge_name='br-int',has_traffic_filtering=True,id=434f2b55-d79b-4459-9ed7-924027ebd4e4,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap434f2b55-d7')
Sep 30 07:28:45 compute-0 unix_chkpwd[218101]: password check failed for user (root)
Sep 30 07:28:46 compute-0 nova_compute[189265]: 2025-09-30 07:28:46.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:28:46 compute-0 nova_compute[189265]: 2025-09-30 07:28:46.915 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:28:46 compute-0 nova_compute[189265]: 2025-09-30 07:28:46.916 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:28:46 compute-0 nova_compute[189265]: 2025-09-30 07:28:46.916 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] No VIF found with MAC fa:16:3e:9a:b9:c8, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 07:28:46 compute-0 nova_compute[189265]: 2025-09-30 07:28:46.918 2 INFO nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Using config drive
Sep 30 07:28:47 compute-0 nova_compute[189265]: 2025-09-30 07:28:47.310 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:28:47 compute-0 nova_compute[189265]: 2025-09-30 07:28:47.310 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:28:47 compute-0 nova_compute[189265]: 2025-09-30 07:28:47.311 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:28:47 compute-0 nova_compute[189265]: 2025-09-30 07:28:47.311 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:28:47 compute-0 nova_compute[189265]: 2025-09-30 07:28:47.443 2 WARNING neutronclient.v2_0.client [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:28:47 compute-0 sshd-session[218037]: Failed password for root from 91.224.92.108 port 64620 ssh2
Sep 30 07:28:47 compute-0 nova_compute[189265]: 2025-09-30 07:28:47.646 2 INFO nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Creating config drive at /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk.config
Sep 30 07:28:47 compute-0 nova_compute[189265]: 2025-09-30 07:28:47.652 2 DEBUG oslo_concurrency.processutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp6cxkzs5l execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:28:47 compute-0 nova_compute[189265]: 2025-09-30 07:28:47.796 2 DEBUG oslo_concurrency.processutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp6cxkzs5l" returned: 0 in 0.143s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:28:47 compute-0 kernel: tap434f2b55-d7: entered promiscuous mode
Sep 30 07:28:47 compute-0 NetworkManager[51813]: <info>  [1759217327.9043] manager: (tap434f2b55-d7): new Tun device (/org/freedesktop/NetworkManager/Devices/54)
Sep 30 07:28:47 compute-0 ovn_controller[91436]: 2025-09-30T07:28:47Z|00151|binding|INFO|Claiming lport 434f2b55-d79b-4459-9ed7-924027ebd4e4 for this chassis.
Sep 30 07:28:47 compute-0 ovn_controller[91436]: 2025-09-30T07:28:47Z|00152|binding|INFO|434f2b55-d79b-4459-9ed7-924027ebd4e4: Claiming fa:16:3e:9a:b9:c8 10.100.0.6
Sep 30 07:28:47 compute-0 nova_compute[189265]: 2025-09-30 07:28:47.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:47 compute-0 nova_compute[189265]: 2025-09-30 07:28:47.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:47 compute-0 nova_compute[189265]: 2025-09-30 07:28:47.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:47.928 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:b9:c8 10.100.0.6'], port_security=['fa:16:3e:9a:b9:c8 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c461f91a-e2a7-4222-a940-d8ab09ea4807', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=434f2b55-d79b-4459-9ed7-924027ebd4e4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:28:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:47.929 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 434f2b55-d79b-4459-9ed7-924027ebd4e4 in datapath c99c822b-3191-49e5-b938-903e25b4a9bb bound to our chassis
Sep 30 07:28:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:47.932 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:28:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:47.951 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[fc1d3879-ca0e-44fc-a28b-f96374b3da57]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:47.952 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc99c822b-31 in ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 07:28:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:47.955 210650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc99c822b-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 07:28:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:47.955 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[75f0fe9f-0c37-4968-9e24-3e7fb519a5f0]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:47.956 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[49b11789-1e6c-4316-97ab-7aef900d2152]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:47 compute-0 systemd-machined[149233]: New machine qemu-11-instance-0000000f.
Sep 30 07:28:47 compute-0 systemd-udevd[218122]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:28:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:47.974 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[1548b3bc-adc0-4910-9d9d-19b4006bf539]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:47 compute-0 NetworkManager[51813]: <info>  [1759217327.9893] device (tap434f2b55-d7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:28:47 compute-0 NetworkManager[51813]: <info>  [1759217327.9912] device (tap434f2b55-d7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:28:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:47.997 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f0fb0876-152f-4c52-8bae-7ec4b9d5d830]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:47.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:48 compute-0 ovn_controller[91436]: 2025-09-30T07:28:48Z|00153|binding|INFO|Setting lport 434f2b55-d79b-4459-9ed7-924027ebd4e4 ovn-installed in OVS
Sep 30 07:28:48 compute-0 ovn_controller[91436]: 2025-09-30T07:28:48Z|00154|binding|INFO|Setting lport 434f2b55-d79b-4459-9ed7-924027ebd4e4 up in Southbound
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:48 compute-0 systemd[1]: Started Virtual Machine qemu-11-instance-0000000f.
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.045 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[e8483af0-0e8d-4f89-93e7-e45565ac6ae5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:48 compute-0 systemd-udevd[218126]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.052 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[a00559d6-2967-460a-b2b2-eb4b730af173]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:48 compute-0 NetworkManager[51813]: <info>  [1759217328.0544] manager: (tapc99c822b-30): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.101 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[84e585b5-c304-4abd-b5e7-895c08fa4800]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.104 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[f5ccdff5-2926-4adc-8dc1-7151df801df7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:48 compute-0 NetworkManager[51813]: <info>  [1759217328.1397] device (tapc99c822b-30): carrier: link connected
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.147 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[337792fc-4022-4b92-ab85-d205ca0acd85]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.170 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[2611c254-644c-452f-bd04-333f7ed110f4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518586, 'reachable_time': 27010, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218154, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.191 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[357a5cd9-38b0-45b1-8644-d142fb27728f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:678c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 518586, 'tstamp': 518586}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218155, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.211 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[9a528d70-d6db-4ce6-9021-f87b6a6afef8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518586, 'reachable_time': 27010, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218156, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.255 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[371365dc-d540-4de0-9fb5-297eb6217792]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.334 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[71000c76-0fda-4e01-a626-800330a9fcaa]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.335 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.335 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.336 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc99c822b-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:48 compute-0 kernel: tapc99c822b-30: entered promiscuous mode
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.339 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc99c822b-30, col_values=(('external_ids', {'iface-id': '67b7df48-3f38-444a-8506-1c0ec5bd1d15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:28:48 compute-0 NetworkManager[51813]: <info>  [1759217328.3404] manager: (tapc99c822b-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:48 compute-0 ovn_controller[91436]: 2025-09-30T07:28:48Z|00155|binding|INFO|Releasing lport 67b7df48-3f38-444a-8506-1c0ec5bd1d15 from this chassis (sb_readonly=0)
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.357 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[47461ad7-58ba-4e0b-ac3f-fb6fb4a2c257]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.357 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.357 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.357 100322 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for c99c822b-3191-49e5-b938-903e25b4a9bb disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.358 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.358 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[74ea6a58-d0f1-4057-9474-c68e89cb4034]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.358 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.359 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[98740740-05c2-4a84-acc5-10b89fd55378]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.359 100322 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: global
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     log         /dev/log local0 debug
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     log-tag     haproxy-metadata-proxy-c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     user        root
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     group       root
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     maxconn     1024
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     pidfile     /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     daemon
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: defaults
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     log global
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     mode http
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     option httplog
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     option dontlognull
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     option http-server-close
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     option forwardfor
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     retries                 3
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     timeout http-request    30s
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     timeout connect         30s
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     timeout client          32s
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     timeout server          32s
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     timeout http-keep-alive 30s
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: listen listener
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     bind 169.254.169.254:80
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:     http-request add-header X-OVN-Network-ID c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 07:28:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:28:48.361 100322 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'env', 'PROCESS_TAG=haproxy-c99c822b-3191-49e5-b938-903e25b4a9bb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c99c822b-3191-49e5-b938-903e25b4a9bb.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.363 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.448 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.449 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.518 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.719 2 DEBUG nova.compute.manager [req-a1918daa-4fc3-480d-a884-1f17086a89ae req-e4de0371-d12c-461f-be56-246cd01085a2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Received event network-vif-plugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.720 2 DEBUG oslo_concurrency.lockutils [req-a1918daa-4fc3-480d-a884-1f17086a89ae req-e4de0371-d12c-461f-be56-246cd01085a2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.720 2 DEBUG oslo_concurrency.lockutils [req-a1918daa-4fc3-480d-a884-1f17086a89ae req-e4de0371-d12c-461f-be56-246cd01085a2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.720 2 DEBUG oslo_concurrency.lockutils [req-a1918daa-4fc3-480d-a884-1f17086a89ae req-e4de0371-d12c-461f-be56-246cd01085a2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.720 2 DEBUG nova.compute.manager [req-a1918daa-4fc3-480d-a884-1f17086a89ae req-e4de0371-d12c-461f-be56-246cd01085a2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Processing event network-vif-plugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.733 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.734 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.753 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.753 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5862MB free_disk=73.30374526977539GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.753 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:28:48 compute-0 nova_compute[189265]: 2025-09-30 07:28:48.754 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:28:48 compute-0 podman[218201]: 2025-09-30 07:28:48.771357737 +0000 UTC m=+0.060250369 container create 65df65d723457449d37041eb73dbd6c712a1c63bc36677ed39f06efeb9531692 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 07:28:48 compute-0 systemd[1]: Started libpod-conmon-65df65d723457449d37041eb73dbd6c712a1c63bc36677ed39f06efeb9531692.scope.
Sep 30 07:28:48 compute-0 podman[218201]: 2025-09-30 07:28:48.738068043 +0000 UTC m=+0.026960685 image pull eeebcc09bc72f81ab45f5ab87eb8f6a7b554b949227aeec082bdb0732754ddc8 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 07:28:48 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:28:48 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89cca207cddb5cbbd568236f47be46be70b370057646a808bd608c63070bfea3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 07:28:48 compute-0 podman[218201]: 2025-09-30 07:28:48.861501091 +0000 UTC m=+0.150393733 container init 65df65d723457449d37041eb73dbd6c712a1c63bc36677ed39f06efeb9531692 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Sep 30 07:28:48 compute-0 podman[218201]: 2025-09-30 07:28:48.871496904 +0000 UTC m=+0.160389526 container start 65df65d723457449d37041eb73dbd6c712a1c63bc36677ed39f06efeb9531692 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Sep 30 07:28:48 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[218217]: [NOTICE]   (218221) : New worker (218223) forked
Sep 30 07:28:48 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[218217]: [NOTICE]   (218221) : Loading success.
Sep 30 07:28:49 compute-0 nova_compute[189265]: 2025-09-30 07:28:49.072 2 DEBUG nova.compute.manager [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 07:28:49 compute-0 nova_compute[189265]: 2025-09-30 07:28:49.080 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 07:28:49 compute-0 nova_compute[189265]: 2025-09-30 07:28:49.083 2 INFO nova.virt.libvirt.driver [-] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Instance spawned successfully.
Sep 30 07:28:49 compute-0 nova_compute[189265]: 2025-09-30 07:28:49.083 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 07:28:49 compute-0 sshd-session[218037]: Received disconnect from 91.224.92.108 port 64620:11:  [preauth]
Sep 30 07:28:49 compute-0 sshd-session[218037]: Disconnected from authenticating user root 91.224.92.108 port 64620 [preauth]
Sep 30 07:28:49 compute-0 sshd-session[218037]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.108  user=root
Sep 30 07:28:49 compute-0 nova_compute[189265]: 2025-09-30 07:28:49.596 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:28:49 compute-0 nova_compute[189265]: 2025-09-30 07:28:49.596 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:28:49 compute-0 nova_compute[189265]: 2025-09-30 07:28:49.597 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:28:49 compute-0 nova_compute[189265]: 2025-09-30 07:28:49.597 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:28:49 compute-0 nova_compute[189265]: 2025-09-30 07:28:49.598 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:28:49 compute-0 nova_compute[189265]: 2025-09-30 07:28:49.598 2 DEBUG nova.virt.libvirt.driver [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:28:49 compute-0 nova_compute[189265]: 2025-09-30 07:28:49.818 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance c461f91a-e2a7-4222-a940-d8ab09ea4807 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:28:49 compute-0 nova_compute[189265]: 2025-09-30 07:28:49.818 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:28:49 compute-0 nova_compute[189265]: 2025-09-30 07:28:49.819 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:28:48 up  1:26,  0 user,  load average: 0.20, 0.19, 0.31\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_spawning': '1', 'num_os_type_None': '1', 'num_proj_6431607f3dce4c88bbf6d17ee6cd45b2': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:28:49 compute-0 nova_compute[189265]: 2025-09-30 07:28:49.852 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:28:50 compute-0 nova_compute[189265]: 2025-09-30 07:28:50.109 2 INFO nova.compute.manager [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Took 11.03 seconds to spawn the instance on the hypervisor.
Sep 30 07:28:50 compute-0 nova_compute[189265]: 2025-09-30 07:28:50.109 2 DEBUG nova.compute.manager [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:28:50 compute-0 nova_compute[189265]: 2025-09-30 07:28:50.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:50 compute-0 nova_compute[189265]: 2025-09-30 07:28:50.359 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:28:50 compute-0 nova_compute[189265]: 2025-09-30 07:28:50.651 2 INFO nova.compute.manager [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Took 16.28 seconds to build instance.
Sep 30 07:28:50 compute-0 nova_compute[189265]: 2025-09-30 07:28:50.802 2 DEBUG nova.compute.manager [req-5f28d5f0-e03f-464e-84a8-7b0d42b45bae req-c75729ad-023c-4c6a-8d47-720a397a8e8a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Received event network-vif-plugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:28:50 compute-0 nova_compute[189265]: 2025-09-30 07:28:50.802 2 DEBUG oslo_concurrency.lockutils [req-5f28d5f0-e03f-464e-84a8-7b0d42b45bae req-c75729ad-023c-4c6a-8d47-720a397a8e8a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:28:50 compute-0 nova_compute[189265]: 2025-09-30 07:28:50.803 2 DEBUG oslo_concurrency.lockutils [req-5f28d5f0-e03f-464e-84a8-7b0d42b45bae req-c75729ad-023c-4c6a-8d47-720a397a8e8a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:28:50 compute-0 nova_compute[189265]: 2025-09-30 07:28:50.803 2 DEBUG oslo_concurrency.lockutils [req-5f28d5f0-e03f-464e-84a8-7b0d42b45bae req-c75729ad-023c-4c6a-8d47-720a397a8e8a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:28:50 compute-0 nova_compute[189265]: 2025-09-30 07:28:50.803 2 DEBUG nova.compute.manager [req-5f28d5f0-e03f-464e-84a8-7b0d42b45bae req-c75729ad-023c-4c6a-8d47-720a397a8e8a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] No waiting events found dispatching network-vif-plugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:28:50 compute-0 nova_compute[189265]: 2025-09-30 07:28:50.803 2 WARNING nova.compute.manager [req-5f28d5f0-e03f-464e-84a8-7b0d42b45bae req-c75729ad-023c-4c6a-8d47-720a397a8e8a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Received unexpected event network-vif-plugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 for instance with vm_state active and task_state None.
Sep 30 07:28:50 compute-0 nova_compute[189265]: 2025-09-30 07:28:50.871 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:28:50 compute-0 nova_compute[189265]: 2025-09-30 07:28:50.871 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.117s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:28:50 compute-0 nova_compute[189265]: 2025-09-30 07:28:50.871 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:28:51 compute-0 nova_compute[189265]: 2025-09-30 07:28:51.159 2 DEBUG oslo_concurrency.lockutils [None req-f3dc7a36-7801-422f-a861-6dceffe9b7ad 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.810s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:28:53 compute-0 nova_compute[189265]: 2025-09-30 07:28:53.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:54 compute-0 nova_compute[189265]: 2025-09-30 07:28:54.378 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:28:54 compute-0 nova_compute[189265]: 2025-09-30 07:28:54.378 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:28:54 compute-0 nova_compute[189265]: 2025-09-30 07:28:54.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:28:55 compute-0 nova_compute[189265]: 2025-09-30 07:28:55.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:55 compute-0 podman[218232]: 2025-09-30 07:28:55.503227414 +0000 UTC m=+0.082989403 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:28:58 compute-0 nova_compute[189265]: 2025-09-30 07:28:58.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:28:59 compute-0 podman[199733]: time="2025-09-30T07:28:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:28:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:28:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:28:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:28:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3472 "" "Go-http-client/1.1"
Sep 30 07:28:59 compute-0 nova_compute[189265]: 2025-09-30 07:28:59.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:28:59 compute-0 nova_compute[189265]: 2025-09-30 07:28:59.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 07:29:00 compute-0 ovn_controller[91436]: 2025-09-30T07:29:00Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9a:b9:c8 10.100.0.6
Sep 30 07:29:00 compute-0 ovn_controller[91436]: 2025-09-30T07:29:00Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9a:b9:c8 10.100.0.6
Sep 30 07:29:00 compute-0 nova_compute[189265]: 2025-09-30 07:29:00.297 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 07:29:00 compute-0 nova_compute[189265]: 2025-09-30 07:29:00.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:01 compute-0 openstack_network_exporter[201859]: ERROR   07:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:29:01 compute-0 openstack_network_exporter[201859]: ERROR   07:29:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:29:01 compute-0 openstack_network_exporter[201859]: ERROR   07:29:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:29:01 compute-0 openstack_network_exporter[201859]: ERROR   07:29:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:29:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:29:01 compute-0 openstack_network_exporter[201859]: ERROR   07:29:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:29:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:29:01 compute-0 nova_compute[189265]: 2025-09-30 07:29:01.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:29:01 compute-0 nova_compute[189265]: 2025-09-30 07:29:01.789 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 07:29:03 compute-0 nova_compute[189265]: 2025-09-30 07:29:03.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:04 compute-0 podman[218276]: 2025-09-30 07:29:04.519505829 +0000 UTC m=+0.092383418 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:29:05 compute-0 nova_compute[189265]: 2025-09-30 07:29:05.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:07 compute-0 podman[218297]: 2025-09-30 07:29:07.492285372 +0000 UTC m=+0.070268532 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Sep 30 07:29:08 compute-0 nova_compute[189265]: 2025-09-30 07:29:08.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:10 compute-0 nova_compute[189265]: 2025-09-30 07:29:10.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:11 compute-0 podman[218318]: 2025-09-30 07:29:11.527152021 +0000 UTC m=+0.094955752 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 07:29:11 compute-0 podman[218319]: 2025-09-30 07:29:11.537170215 +0000 UTC m=+0.097634688 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, 
tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 07:29:11 compute-0 podman[218320]: 2025-09-30 07:29:11.620073664 +0000 UTC m=+0.184339095 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 07:29:13 compute-0 nova_compute[189265]: 2025-09-30 07:29:13.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:15 compute-0 nova_compute[189265]: 2025-09-30 07:29:15.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:15 compute-0 nova_compute[189265]: 2025-09-30 07:29:15.439 2 DEBUG nova.compute.manager [None req-91abf66e-aff9-49a7-8e74-1ade64b84f21 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Sep 30 07:29:15 compute-0 nova_compute[189265]: 2025-09-30 07:29:15.493 2 DEBUG nova.compute.provider_tree [None req-91abf66e-aff9-49a7-8e74-1ade64b84f21 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Updating resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc generation from 15 to 19 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 07:29:18 compute-0 ovn_controller[91436]: 2025-09-30T07:29:18Z|00156|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 07:29:18 compute-0 nova_compute[189265]: 2025-09-30 07:29:18.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:20 compute-0 nova_compute[189265]: 2025-09-30 07:29:20.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:20.559 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:29:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:20.559 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:29:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:20.559 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:29:23 compute-0 nova_compute[189265]: 2025-09-30 07:29:23.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:23 compute-0 nova_compute[189265]: 2025-09-30 07:29:23.437 2 DEBUG nova.virt.libvirt.driver [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Check if temp file /var/lib/nova/instances/tmp8hldmcu3 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 07:29:23 compute-0 nova_compute[189265]: 2025-09-30 07:29:23.441 2 DEBUG nova.compute.manager [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8hldmcu3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c461f91a-e2a7-4222-a940-d8ab09ea4807',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 07:29:25 compute-0 nova_compute[189265]: 2025-09-30 07:29:25.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:26 compute-0 podman[218381]: 2025-09-30 07:29:26.45535607 +0000 UTC m=+0.045493110 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:29:27 compute-0 nova_compute[189265]: 2025-09-30 07:29:27.774 2 DEBUG oslo_concurrency.processutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:29:27 compute-0 nova_compute[189265]: 2025-09-30 07:29:27.847 2 DEBUG oslo_concurrency.processutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:29:27 compute-0 nova_compute[189265]: 2025-09-30 07:29:27.849 2 DEBUG oslo_concurrency.processutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:29:27 compute-0 nova_compute[189265]: 2025-09-30 07:29:27.925 2 DEBUG oslo_concurrency.processutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:29:27 compute-0 nova_compute[189265]: 2025-09-30 07:29:27.927 2 DEBUG nova.compute.manager [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Preparing to wait for external event network-vif-plugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 07:29:27 compute-0 nova_compute[189265]: 2025-09-30 07:29:27.928 2 DEBUG oslo_concurrency.lockutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:29:27 compute-0 nova_compute[189265]: 2025-09-30 07:29:27.928 2 DEBUG oslo_concurrency.lockutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:29:27 compute-0 nova_compute[189265]: 2025-09-30 07:29:27.929 2 DEBUG oslo_concurrency.lockutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:29:28 compute-0 nova_compute[189265]: 2025-09-30 07:29:28.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:29 compute-0 podman[199733]: time="2025-09-30T07:29:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:29:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:29:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:29:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:29:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3473 "" "Go-http-client/1.1"
Sep 30 07:29:30 compute-0 nova_compute[189265]: 2025-09-30 07:29:30.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:31 compute-0 openstack_network_exporter[201859]: ERROR   07:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:29:31 compute-0 openstack_network_exporter[201859]: ERROR   07:29:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:29:31 compute-0 openstack_network_exporter[201859]: ERROR   07:29:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:29:31 compute-0 openstack_network_exporter[201859]: ERROR   07:29:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:29:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:29:31 compute-0 openstack_network_exporter[201859]: ERROR   07:29:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:29:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:29:33 compute-0 nova_compute[189265]: 2025-09-30 07:29:33.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:34.547 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:29:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:34.547 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:29:34 compute-0 nova_compute[189265]: 2025-09-30 07:29:34.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:34 compute-0 nova_compute[189265]: 2025-09-30 07:29:34.608 2 DEBUG nova.compute.manager [req-43ee0433-eb44-484e-8217-9bf0a21c2660 req-6b5c4168-6333-43ef-bb75-bc04b6996d8c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Received event network-vif-unplugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:29:34 compute-0 nova_compute[189265]: 2025-09-30 07:29:34.609 2 DEBUG oslo_concurrency.lockutils [req-43ee0433-eb44-484e-8217-9bf0a21c2660 req-6b5c4168-6333-43ef-bb75-bc04b6996d8c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:29:34 compute-0 nova_compute[189265]: 2025-09-30 07:29:34.609 2 DEBUG oslo_concurrency.lockutils [req-43ee0433-eb44-484e-8217-9bf0a21c2660 req-6b5c4168-6333-43ef-bb75-bc04b6996d8c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:29:34 compute-0 nova_compute[189265]: 2025-09-30 07:29:34.610 2 DEBUG oslo_concurrency.lockutils [req-43ee0433-eb44-484e-8217-9bf0a21c2660 req-6b5c4168-6333-43ef-bb75-bc04b6996d8c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:29:34 compute-0 nova_compute[189265]: 2025-09-30 07:29:34.610 2 DEBUG nova.compute.manager [req-43ee0433-eb44-484e-8217-9bf0a21c2660 req-6b5c4168-6333-43ef-bb75-bc04b6996d8c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] No event matching network-vif-unplugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 in dict_keys([('network-vif-plugged', '434f2b55-d79b-4459-9ed7-924027ebd4e4')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 07:29:34 compute-0 nova_compute[189265]: 2025-09-30 07:29:34.611 2 DEBUG nova.compute.manager [req-43ee0433-eb44-484e-8217-9bf0a21c2660 req-6b5c4168-6333-43ef-bb75-bc04b6996d8c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Received event network-vif-unplugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:29:35 compute-0 nova_compute[189265]: 2025-09-30 07:29:35.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:35 compute-0 podman[218412]: 2025-09-30 07:29:35.511733539 +0000 UTC m=+0.090145575 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 07:29:36 compute-0 nova_compute[189265]: 2025-09-30 07:29:36.455 2 INFO nova.compute.manager [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Took 8.53 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 07:29:36 compute-0 nova_compute[189265]: 2025-09-30 07:29:36.687 2 DEBUG nova.compute.manager [req-f7274c4c-34f1-4caf-82f2-318c34edb515 req-1ff7faa2-4778-4357-ac4f-1dbf5260fbe6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Received event network-vif-plugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:29:36 compute-0 nova_compute[189265]: 2025-09-30 07:29:36.688 2 DEBUG oslo_concurrency.lockutils [req-f7274c4c-34f1-4caf-82f2-318c34edb515 req-1ff7faa2-4778-4357-ac4f-1dbf5260fbe6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:29:36 compute-0 nova_compute[189265]: 2025-09-30 07:29:36.688 2 DEBUG oslo_concurrency.lockutils [req-f7274c4c-34f1-4caf-82f2-318c34edb515 req-1ff7faa2-4778-4357-ac4f-1dbf5260fbe6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:29:36 compute-0 nova_compute[189265]: 2025-09-30 07:29:36.689 2 DEBUG oslo_concurrency.lockutils [req-f7274c4c-34f1-4caf-82f2-318c34edb515 req-1ff7faa2-4778-4357-ac4f-1dbf5260fbe6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:29:36 compute-0 nova_compute[189265]: 2025-09-30 07:29:36.689 2 DEBUG nova.compute.manager [req-f7274c4c-34f1-4caf-82f2-318c34edb515 req-1ff7faa2-4778-4357-ac4f-1dbf5260fbe6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Processing event network-vif-plugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 07:29:36 compute-0 nova_compute[189265]: 2025-09-30 07:29:36.689 2 DEBUG nova.compute.manager [req-f7274c4c-34f1-4caf-82f2-318c34edb515 req-1ff7faa2-4778-4357-ac4f-1dbf5260fbe6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Received event network-changed-434f2b55-d79b-4459-9ed7-924027ebd4e4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:29:36 compute-0 nova_compute[189265]: 2025-09-30 07:29:36.689 2 DEBUG nova.compute.manager [req-f7274c4c-34f1-4caf-82f2-318c34edb515 req-1ff7faa2-4778-4357-ac4f-1dbf5260fbe6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Refreshing instance network info cache due to event network-changed-434f2b55-d79b-4459-9ed7-924027ebd4e4. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 07:29:36 compute-0 nova_compute[189265]: 2025-09-30 07:29:36.690 2 DEBUG oslo_concurrency.lockutils [req-f7274c4c-34f1-4caf-82f2-318c34edb515 req-1ff7faa2-4778-4357-ac4f-1dbf5260fbe6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-c461f91a-e2a7-4222-a940-d8ab09ea4807" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:29:36 compute-0 nova_compute[189265]: 2025-09-30 07:29:36.690 2 DEBUG oslo_concurrency.lockutils [req-f7274c4c-34f1-4caf-82f2-318c34edb515 req-1ff7faa2-4778-4357-ac4f-1dbf5260fbe6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-c461f91a-e2a7-4222-a940-d8ab09ea4807" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:29:36 compute-0 nova_compute[189265]: 2025-09-30 07:29:36.690 2 DEBUG nova.network.neutron [req-f7274c4c-34f1-4caf-82f2-318c34edb515 req-1ff7faa2-4778-4357-ac4f-1dbf5260fbe6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Refreshing network info cache for port 434f2b55-d79b-4459-9ed7-924027ebd4e4 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 07:29:36 compute-0 nova_compute[189265]: 2025-09-30 07:29:36.692 2 DEBUG nova.compute.manager [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 07:29:37 compute-0 nova_compute[189265]: 2025-09-30 07:29:37.199 2 DEBUG nova.compute.manager [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp8hldmcu3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c461f91a-e2a7-4222-a940-d8ab09ea4807',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(c519074c-77cb-4d2d-a9a3-bb6c93d56bdd),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 07:29:37 compute-0 nova_compute[189265]: 2025-09-30 07:29:37.209 2 WARNING neutronclient.v2_0.client [req-f7274c4c-34f1-4caf-82f2-318c34edb515 req-1ff7faa2-4778-4357-ac4f-1dbf5260fbe6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:29:37 compute-0 nova_compute[189265]: 2025-09-30 07:29:37.713 2 DEBUG nova.objects.instance [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'migration_context' on Instance uuid c461f91a-e2a7-4222-a940-d8ab09ea4807 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:29:37 compute-0 nova_compute[189265]: 2025-09-30 07:29:37.714 2 DEBUG nova.virt.libvirt.driver [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 07:29:37 compute-0 nova_compute[189265]: 2025-09-30 07:29:37.715 2 DEBUG nova.virt.libvirt.driver [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 07:29:37 compute-0 nova_compute[189265]: 2025-09-30 07:29:37.715 2 DEBUG nova.virt.libvirt.driver [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 07:29:37 compute-0 nova_compute[189265]: 2025-09-30 07:29:37.899 2 WARNING neutronclient.v2_0.client [req-f7274c4c-34f1-4caf-82f2-318c34edb515 req-1ff7faa2-4778-4357-ac4f-1dbf5260fbe6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:29:38 compute-0 nova_compute[189265]: 2025-09-30 07:29:38.217 2 DEBUG nova.virt.libvirt.driver [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 07:29:38 compute-0 nova_compute[189265]: 2025-09-30 07:29:38.218 2 DEBUG nova.virt.libvirt.driver [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 07:29:38 compute-0 nova_compute[189265]: 2025-09-30 07:29:38.225 2 DEBUG nova.virt.libvirt.vif [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:28:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1498059428',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1498059428',id=15,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:28:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-1pekgb53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:28:50Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=c461f91a-e2a7-4222-a940-d8ab09ea4807,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "address": "fa:16:3e:9a:b9:c8", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap434f2b55-d7", "ovs_interfaceid": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 07:29:38 compute-0 nova_compute[189265]: 2025-09-30 07:29:38.225 2 DEBUG nova.network.os_vif_util [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "address": "fa:16:3e:9a:b9:c8", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap434f2b55-d7", "ovs_interfaceid": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:29:38 compute-0 nova_compute[189265]: 2025-09-30 07:29:38.226 2 DEBUG nova.network.os_vif_util [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:b9:c8,bridge_name='br-int',has_traffic_filtering=True,id=434f2b55-d79b-4459-9ed7-924027ebd4e4,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap434f2b55-d7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:29:38 compute-0 nova_compute[189265]: 2025-09-30 07:29:38.227 2 DEBUG nova.virt.libvirt.migration [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <mac address="fa:16:3e:9a:b9:c8"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <model type="virtio"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <mtu size="1442"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <target dev="tap434f2b55-d7"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]: </interface>
Sep 30 07:29:38 compute-0 nova_compute[189265]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 07:29:38 compute-0 nova_compute[189265]: 2025-09-30 07:29:38.228 2 DEBUG nova.virt.libvirt.migration [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <name>instance-0000000f</name>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <uuid>c461f91a-e2a7-4222-a940-d8ab09ea4807</uuid>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteStrategies-server-1498059428</nova:name>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:28:44</nova:creationTime>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:29:38 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:29:38 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:user uuid="89ba5d19014145188ad2a3c812acdc88">tempest-TestExecuteStrategies-1096120513-project-admin</nova:user>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:project uuid="6431607f3dce4c88bbf6d17ee6cd45b2">tempest-TestExecuteStrategies-1096120513</nova:project>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:port uuid="434f2b55-d79b-4459-9ed7-924027ebd4e4">
Sep 30 07:29:38 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <memory unit="KiB">131072</memory>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <vcpu placement="static">1</vcpu>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <resource>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <partition>/machine</partition>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </resource>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <system>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="serial">c461f91a-e2a7-4222-a940-d8ab09ea4807</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="uuid">c461f91a-e2a7-4222-a940-d8ab09ea4807</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </system>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <os>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </os>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <features>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <vmcoreinfo state="on"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </features>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <cpu mode="host-model" check="partial">
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <on_poweroff>destroy</on_poweroff>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <on_reboot>restart</on_reboot>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <on_crash>destroy</on_crash>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk.config"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <readonly/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="1" port="0x10"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="2" port="0x11"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="3" port="0x12"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="4" port="0x13"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="5" port="0x14"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="6" port="0x15"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="7" port="0x16"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="8" port="0x17"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="9" port="0x18"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="10" port="0x19"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="11" port="0x1a"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="12" port="0x1b"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="13" port="0x1c"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="14" port="0x1d"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="15" port="0x1e"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="16" port="0x1f"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="17" port="0x20"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="18" port="0x21"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="19" port="0x22"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="20" port="0x23"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="21" port="0x24"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="22" port="0x25"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="23" port="0x26"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="24" port="0x27"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="25" port="0x28"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-pci-bridge"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="sata" index="0">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:9a:b9:c8"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target dev="tap434f2b55-d7"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/console.log" append="off"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target type="isa-serial" port="0">
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <model name="isa-serial"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       </target>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <console type="pty">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/console.log" append="off"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target type="serial" port="0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </console>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="usb" bus="0" port="1"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </input>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <input type="mouse" bus="ps2"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <listen type="address" address="::"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </graphics>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <video>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </video>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]: </domain>
Sep 30 07:29:38 compute-0 nova_compute[189265]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 07:29:38 compute-0 nova_compute[189265]: 2025-09-30 07:29:38.229 2 DEBUG nova.virt.libvirt.migration [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <name>instance-0000000f</name>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <uuid>c461f91a-e2a7-4222-a940-d8ab09ea4807</uuid>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteStrategies-server-1498059428</nova:name>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:28:44</nova:creationTime>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:29:38 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:29:38 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:user uuid="89ba5d19014145188ad2a3c812acdc88">tempest-TestExecuteStrategies-1096120513-project-admin</nova:user>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:project uuid="6431607f3dce4c88bbf6d17ee6cd45b2">tempest-TestExecuteStrategies-1096120513</nova:project>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:port uuid="434f2b55-d79b-4459-9ed7-924027ebd4e4">
Sep 30 07:29:38 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <memory unit="KiB">131072</memory>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <vcpu placement="static">1</vcpu>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <resource>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <partition>/machine</partition>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </resource>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <system>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="serial">c461f91a-e2a7-4222-a940-d8ab09ea4807</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="uuid">c461f91a-e2a7-4222-a940-d8ab09ea4807</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </system>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <os>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </os>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <features>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <vmcoreinfo state="on"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </features>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <cpu mode="host-model" check="partial">
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <on_poweroff>destroy</on_poweroff>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <on_reboot>restart</on_reboot>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <on_crash>destroy</on_crash>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk.config"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <readonly/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="1" port="0x10"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="2" port="0x11"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="3" port="0x12"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="4" port="0x13"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="5" port="0x14"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="6" port="0x15"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="7" port="0x16"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="8" port="0x17"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="9" port="0x18"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="10" port="0x19"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="11" port="0x1a"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="12" port="0x1b"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="13" port="0x1c"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="14" port="0x1d"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="15" port="0x1e"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="16" port="0x1f"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="17" port="0x20"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="18" port="0x21"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="19" port="0x22"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="20" port="0x23"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="21" port="0x24"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="22" port="0x25"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="23" port="0x26"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="24" port="0x27"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="25" port="0x28"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-pci-bridge"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="sata" index="0">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:9a:b9:c8"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target dev="tap434f2b55-d7"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/console.log" append="off"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target type="isa-serial" port="0">
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <model name="isa-serial"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       </target>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <console type="pty">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/console.log" append="off"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target type="serial" port="0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </console>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="usb" bus="0" port="1"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </input>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <input type="mouse" bus="ps2"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <listen type="address" address="::"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </graphics>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <video>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </video>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]: </domain>
Sep 30 07:29:38 compute-0 nova_compute[189265]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 07:29:38 compute-0 nova_compute[189265]: 2025-09-30 07:29:38.230 2 DEBUG nova.virt.libvirt.migration [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <name>instance-0000000f</name>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <uuid>c461f91a-e2a7-4222-a940-d8ab09ea4807</uuid>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteStrategies-server-1498059428</nova:name>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:28:44</nova:creationTime>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:29:38 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:29:38 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:user uuid="89ba5d19014145188ad2a3c812acdc88">tempest-TestExecuteStrategies-1096120513-project-admin</nova:user>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:project uuid="6431607f3dce4c88bbf6d17ee6cd45b2">tempest-TestExecuteStrategies-1096120513</nova:project>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <nova:port uuid="434f2b55-d79b-4459-9ed7-924027ebd4e4">
Sep 30 07:29:38 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <memory unit="KiB">131072</memory>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <vcpu placement="static">1</vcpu>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <resource>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <partition>/machine</partition>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </resource>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <system>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="serial">c461f91a-e2a7-4222-a940-d8ab09ea4807</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="uuid">c461f91a-e2a7-4222-a940-d8ab09ea4807</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </system>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <os>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </os>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <features>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <vmcoreinfo state="on"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </features>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <cpu mode="host-model" check="partial">
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <on_poweroff>destroy</on_poweroff>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <on_reboot>restart</on_reboot>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <on_crash>destroy</on_crash>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/disk.config"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <readonly/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="1" port="0x10"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="2" port="0x11"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="3" port="0x12"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="4" port="0x13"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="5" port="0x14"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="6" port="0x15"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="7" port="0x16"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="8" port="0x17"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="9" port="0x18"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="10" port="0x19"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="11" port="0x1a"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="12" port="0x1b"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="13" port="0x1c"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="14" port="0x1d"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="15" port="0x1e"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="16" port="0x1f"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="17" port="0x20"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="18" port="0x21"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="19" port="0x22"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="20" port="0x23"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="21" port="0x24"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="22" port="0x25"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="23" port="0x26"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="24" port="0x27"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target chassis="25" port="0x28"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model name="pcie-pci-bridge"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <controller type="sata" index="0">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <interface type="ethernet"><mac address="fa:16:3e:9a:b9:c8"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap434f2b55-d7"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </interface><serial type="pty">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/console.log" append="off"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target type="isa-serial" port="0">
Sep 30 07:29:38 compute-0 nova_compute[189265]:         <model name="isa-serial"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       </target>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <console type="pty">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807/console.log" append="off"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <target type="serial" port="0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </console>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="usb" bus="0" port="1"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </input>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <input type="mouse" bus="ps2"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <listen type="address" address="::"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </graphics>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <video>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </video>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:29:38 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:29:38 compute-0 nova_compute[189265]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 07:29:38 compute-0 nova_compute[189265]: </domain>
Sep 30 07:29:38 compute-0 nova_compute[189265]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 07:29:38 compute-0 nova_compute[189265]: 2025-09-30 07:29:38.231 2 DEBUG nova.virt.libvirt.driver [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 07:29:38 compute-0 nova_compute[189265]: 2025-09-30 07:29:38.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:38 compute-0 podman[218433]: 2025-09-30 07:29:38.489662919 +0000 UTC m=+0.076376165 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_id=edpm, release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Sep 30 07:29:38 compute-0 nova_compute[189265]: 2025-09-30 07:29:38.651 2 DEBUG nova.network.neutron [req-f7274c4c-34f1-4caf-82f2-318c34edb515 req-1ff7faa2-4778-4357-ac4f-1dbf5260fbe6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Updated VIF entry in instance network info cache for port 434f2b55-d79b-4459-9ed7-924027ebd4e4. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 07:29:38 compute-0 nova_compute[189265]: 2025-09-30 07:29:38.652 2 DEBUG nova.network.neutron [req-f7274c4c-34f1-4caf-82f2-318c34edb515 req-1ff7faa2-4778-4357-ac4f-1dbf5260fbe6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Updating instance_info_cache with network_info: [{"id": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "address": "fa:16:3e:9a:b9:c8", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap434f2b55-d7", "ovs_interfaceid": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:29:38 compute-0 nova_compute[189265]: 2025-09-30 07:29:38.720 2 DEBUG nova.virt.libvirt.migration [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 07:29:38 compute-0 nova_compute[189265]: 2025-09-30 07:29:38.720 2 INFO nova.virt.libvirt.migration [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 07:29:39 compute-0 nova_compute[189265]: 2025-09-30 07:29:39.158 2 DEBUG oslo_concurrency.lockutils [req-f7274c4c-34f1-4caf-82f2-318c34edb515 req-1ff7faa2-4778-4357-ac4f-1dbf5260fbe6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-c461f91a-e2a7-4222-a940-d8ab09ea4807" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:29:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:39.549 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:29:39 compute-0 nova_compute[189265]: 2025-09-30 07:29:39.738 2 INFO nova.virt.libvirt.driver [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 07:29:40 compute-0 kernel: tap434f2b55-d7 (unregistering): left promiscuous mode
Sep 30 07:29:40 compute-0 NetworkManager[51813]: <info>  [1759217380.1883] device (tap434f2b55-d7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:29:40 compute-0 ovn_controller[91436]: 2025-09-30T07:29:40Z|00157|binding|INFO|Releasing lport 434f2b55-d79b-4459-9ed7-924027ebd4e4 from this chassis (sb_readonly=0)
Sep 30 07:29:40 compute-0 ovn_controller[91436]: 2025-09-30T07:29:40Z|00158|binding|INFO|Setting lport 434f2b55-d79b-4459-9ed7-924027ebd4e4 down in Southbound
Sep 30 07:29:40 compute-0 ovn_controller[91436]: 2025-09-30T07:29:40Z|00159|binding|INFO|Removing iface tap434f2b55-d7 ovn-installed in OVS
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:40.204 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9a:b9:c8 10.100.0.6'], port_security=['fa:16:3e:9a:b9:c8 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '8a9138ed-8977-41ff-9b21-ff90eb637e78'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c461f91a-e2a7-4222-a940-d8ab09ea4807', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '10', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=434f2b55-d79b-4459-9ed7-924027ebd4e4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:29:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:40.206 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 434f2b55-d79b-4459-9ed7-924027ebd4e4 in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:29:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:40.208 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c99c822b-3191-49e5-b938-903e25b4a9bb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:29:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:40.209 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f7845278-0f60-4d41-8e3f-9bb4e44e7cbd]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:29:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:40.210 100322 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb namespace which is not needed anymore
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:40 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Sep 30 07:29:40 compute-0 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d0000000f.scope: Consumed 13.888s CPU time.
Sep 30 07:29:40 compute-0 systemd-machined[149233]: Machine qemu-11-instance-0000000f terminated.
Sep 30 07:29:40 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[218217]: [NOTICE]   (218221) : haproxy version is 3.0.5-8e879a5
Sep 30 07:29:40 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[218217]: [NOTICE]   (218221) : path to executable is /usr/sbin/haproxy
Sep 30 07:29:40 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[218217]: [WARNING]  (218221) : Exiting Master process...
Sep 30 07:29:40 compute-0 podman[218484]: 2025-09-30 07:29:40.336105357 +0000 UTC m=+0.026836771 container kill 65df65d723457449d37041eb73dbd6c712a1c63bc36677ed39f06efeb9531692 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:29:40 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[218217]: [ALERT]    (218221) : Current worker (218223) exited with code 143 (Terminated)
Sep 30 07:29:40 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[218217]: [WARNING]  (218221) : All workers exited. Exiting... (0)
Sep 30 07:29:40 compute-0 systemd[1]: libpod-65df65d723457449d37041eb73dbd6c712a1c63bc36677ed39f06efeb9531692.scope: Deactivated successfully.
Sep 30 07:29:40 compute-0 podman[218499]: 2025-09-30 07:29:40.375434561 +0000 UTC m=+0.020876592 container died 65df65d723457449d37041eb73dbd6c712a1c63bc36677ed39f06efeb9531692 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 07:29:40 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-65df65d723457449d37041eb73dbd6c712a1c63bc36677ed39f06efeb9531692-userdata-shm.mount: Deactivated successfully.
Sep 30 07:29:40 compute-0 systemd[1]: var-lib-containers-storage-overlay-89cca207cddb5cbbd568236f47be46be70b370057646a808bd608c63070bfea3-merged.mount: Deactivated successfully.
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.418 2 DEBUG nova.virt.libvirt.guest [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.419 2 INFO nova.virt.libvirt.driver [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Migration operation has completed
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.420 2 INFO nova.compute.manager [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] _post_live_migration() is started..
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.422 2 DEBUG nova.virt.libvirt.driver [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.423 2 DEBUG nova.virt.libvirt.driver [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.423 2 DEBUG nova.virt.libvirt.driver [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.436 2 WARNING neutronclient.v2_0.client [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.436 2 WARNING neutronclient.v2_0.client [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:29:40 compute-0 podman[218499]: 2025-09-30 07:29:40.441454102 +0000 UTC m=+0.086896113 container cleanup 65df65d723457449d37041eb73dbd6c712a1c63bc36677ed39f06efeb9531692 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:29:40 compute-0 systemd[1]: libpod-conmon-65df65d723457449d37041eb73dbd6c712a1c63bc36677ed39f06efeb9531692.scope: Deactivated successfully.
Sep 30 07:29:40 compute-0 podman[218501]: 2025-09-30 07:29:40.467013786 +0000 UTC m=+0.106785216 container remove 65df65d723457449d37041eb73dbd6c712a1c63bc36677ed39f06efeb9531692 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 07:29:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:40.471 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[47cf56fb-e16c-446d-a5df-c3b0843d6f6e]: (4, ("Tue Sep 30 07:29:40 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb (65df65d723457449d37041eb73dbd6c712a1c63bc36677ed39f06efeb9531692)\n65df65d723457449d37041eb73dbd6c712a1c63bc36677ed39f06efeb9531692\nTue Sep 30 07:29:40 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb (65df65d723457449d37041eb73dbd6c712a1c63bc36677ed39f06efeb9531692)\n65df65d723457449d37041eb73dbd6c712a1c63bc36677ed39f06efeb9531692\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:29:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:40.472 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[e4e0da8f-792c-4eb3-bfc7-f7e592a3cd51]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:29:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:40.473 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:29:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:40.473 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[ca49d1e2-a036-4a2a-8dae-0141f8144a23]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:29:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:40.473 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:40 compute-0 kernel: tapc99c822b-30: left promiscuous mode
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:40.490 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[785e4533-834b-45da-a2e5-36ea12e1765b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:29:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:40.518 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[699a8cd5-b818-4421-9c1d-553ed40f1a6f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:29:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:40.519 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[348f33a0-cf08-4c6e-9ec7-0985770059e9]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:29:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:40.532 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[408ea830-03f6-42bb-8e6a-aa2278e023cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 518576, 'reachable_time': 34171, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218551, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:29:40 compute-0 systemd[1]: run-netns-ovnmeta\x2dc99c822b\x2d3191\x2d49e5\x2db938\x2d903e25b4a9bb.mount: Deactivated successfully.
Sep 30 07:29:40 compute-0 systemd[1]: run-netns-ovnmeta\x2dc99c822b\x2d3191\x2d49e5\x2db938\x2d903e25b4a9bb.mount: Deactivated successfully.
Sep 30 07:29:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:40.536 100440 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 07:29:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:29:40.536 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[875f6d6e-0749-4fda-87ca-bfca063d8c60]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.806 2 DEBUG nova.compute.manager [req-ac8fb614-7e05-4358-a9ca-e15d6a56b70e req-ac634355-012a-42a9-899d-56b7b7852ce7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Received event network-vif-unplugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.807 2 DEBUG oslo_concurrency.lockutils [req-ac8fb614-7e05-4358-a9ca-e15d6a56b70e req-ac634355-012a-42a9-899d-56b7b7852ce7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.807 2 DEBUG oslo_concurrency.lockutils [req-ac8fb614-7e05-4358-a9ca-e15d6a56b70e req-ac634355-012a-42a9-899d-56b7b7852ce7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.807 2 DEBUG oslo_concurrency.lockutils [req-ac8fb614-7e05-4358-a9ca-e15d6a56b70e req-ac634355-012a-42a9-899d-56b7b7852ce7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.807 2 DEBUG nova.compute.manager [req-ac8fb614-7e05-4358-a9ca-e15d6a56b70e req-ac634355-012a-42a9-899d-56b7b7852ce7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] No waiting events found dispatching network-vif-unplugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:29:40 compute-0 nova_compute[189265]: 2025-09-30 07:29:40.807 2 DEBUG nova.compute.manager [req-ac8fb614-7e05-4358-a9ca-e15d6a56b70e req-ac634355-012a-42a9-899d-56b7b7852ce7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Received event network-vif-unplugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.695 2 DEBUG nova.network.neutron [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Activated binding for port 434f2b55-d79b-4459-9ed7-924027ebd4e4 and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.696 2 DEBUG nova.compute.manager [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "address": "fa:16:3e:9a:b9:c8", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap434f2b55-d7", "ovs_interfaceid": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.697 2 DEBUG nova.virt.libvirt.vif [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:28:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1498059428',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1498059428',id=15,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:28:50Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-1pekgb53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:29:18Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=c461f91a-e2a7-4222-a940-d8ab09ea4807,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "address": "fa:16:3e:9a:b9:c8", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap434f2b55-d7", "ovs_interfaceid": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.698 2 DEBUG nova.network.os_vif_util [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "address": "fa:16:3e:9a:b9:c8", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap434f2b55-d7", "ovs_interfaceid": "434f2b55-d79b-4459-9ed7-924027ebd4e4", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.699 2 DEBUG nova.network.os_vif_util [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9a:b9:c8,bridge_name='br-int',has_traffic_filtering=True,id=434f2b55-d79b-4459-9ed7-924027ebd4e4,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap434f2b55-d7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.699 2 DEBUG os_vif [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:b9:c8,bridge_name='br-int',has_traffic_filtering=True,id=434f2b55-d79b-4459-9ed7-924027ebd4e4,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap434f2b55-d7') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.703 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap434f2b55-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.707 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.708 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=11fe5315-73cc-4236-95f2-f863dedfb66c) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.709 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.714 2 INFO os_vif [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9a:b9:c8,bridge_name='br-int',has_traffic_filtering=True,id=434f2b55-d79b-4459-9ed7-924027ebd4e4,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap434f2b55-d7')
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.714 2 DEBUG oslo_concurrency.lockutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.714 2 DEBUG oslo_concurrency.lockutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.715 2 DEBUG oslo_concurrency.lockutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.715 2 DEBUG nova.compute.manager [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.715 2 INFO nova.virt.libvirt.driver [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Deleting instance files /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807_del
Sep 30 07:29:41 compute-0 nova_compute[189265]: 2025-09-30 07:29:41.716 2 INFO nova.virt.libvirt.driver [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Deletion of /var/lib/nova/instances/c461f91a-e2a7-4222-a940-d8ab09ea4807_del complete
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.291 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.291 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:29:42 compute-0 podman[218553]: 2025-09-30 07:29:42.48410541 +0000 UTC m=+0.065928859 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 07:29:42 compute-0 podman[218552]: 2025-09-30 07:29:42.488209036 +0000 UTC m=+0.075420608 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd)
Sep 30 07:29:42 compute-0 podman[218559]: 2025-09-30 07:29:42.590168565 +0000 UTC m=+0.155479617 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.878 2 DEBUG nova.compute.manager [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Received event network-vif-plugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.878 2 DEBUG oslo_concurrency.lockutils [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.878 2 DEBUG oslo_concurrency.lockutils [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.879 2 DEBUG oslo_concurrency.lockutils [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.879 2 DEBUG nova.compute.manager [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] No waiting events found dispatching network-vif-plugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.879 2 WARNING nova.compute.manager [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Received unexpected event network-vif-plugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 for instance with vm_state active and task_state migrating.
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.880 2 DEBUG nova.compute.manager [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Received event network-vif-unplugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.880 2 DEBUG oslo_concurrency.lockutils [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.880 2 DEBUG oslo_concurrency.lockutils [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.880 2 DEBUG oslo_concurrency.lockutils [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.881 2 DEBUG nova.compute.manager [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] No waiting events found dispatching network-vif-unplugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.881 2 DEBUG nova.compute.manager [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Received event network-vif-unplugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.881 2 DEBUG nova.compute.manager [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Received event network-vif-plugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.881 2 DEBUG oslo_concurrency.lockutils [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.882 2 DEBUG oslo_concurrency.lockutils [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.882 2 DEBUG oslo_concurrency.lockutils [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.882 2 DEBUG nova.compute.manager [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] No waiting events found dispatching network-vif-plugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:29:42 compute-0 nova_compute[189265]: 2025-09-30 07:29:42.882 2 WARNING nova.compute.manager [req-0292ccf4-ca4b-4f00-8145-02bda3103100 req-8794b31f-975e-48e2-aaec-021cfb6601da 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Received unexpected event network-vif-plugged-434f2b55-d79b-4459-9ed7-924027ebd4e4 for instance with vm_state active and task_state migrating.
Sep 30 07:29:43 compute-0 nova_compute[189265]: 2025-09-30 07:29:43.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:43 compute-0 nova_compute[189265]: 2025-09-30 07:29:43.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:29:44 compute-0 nova_compute[189265]: 2025-09-30 07:29:44.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:29:44 compute-0 nova_compute[189265]: 2025-09-30 07:29:44.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:29:46 compute-0 nova_compute[189265]: 2025-09-30 07:29:46.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:47 compute-0 nova_compute[189265]: 2025-09-30 07:29:47.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:29:48 compute-0 nova_compute[189265]: 2025-09-30 07:29:48.302 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:29:48 compute-0 nova_compute[189265]: 2025-09-30 07:29:48.302 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:29:48 compute-0 nova_compute[189265]: 2025-09-30 07:29:48.303 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:29:48 compute-0 nova_compute[189265]: 2025-09-30 07:29:48.303 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:29:48 compute-0 nova_compute[189265]: 2025-09-30 07:29:48.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:48 compute-0 nova_compute[189265]: 2025-09-30 07:29:48.543 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:29:48 compute-0 nova_compute[189265]: 2025-09-30 07:29:48.545 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:29:48 compute-0 nova_compute[189265]: 2025-09-30 07:29:48.576 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:29:48 compute-0 nova_compute[189265]: 2025-09-30 07:29:48.577 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5860MB free_disk=73.303955078125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:29:48 compute-0 nova_compute[189265]: 2025-09-30 07:29:48.578 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:29:48 compute-0 nova_compute[189265]: 2025-09-30 07:29:48.578 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:29:49 compute-0 nova_compute[189265]: 2025-09-30 07:29:49.600 2 INFO nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Updating resource usage from migration c519074c-77cb-4d2d-a9a3-bb6c93d56bdd
Sep 30 07:29:49 compute-0 nova_compute[189265]: 2025-09-30 07:29:49.725 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Migration c519074c-77cb-4d2d-a9a3-bb6c93d56bdd is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 07:29:49 compute-0 nova_compute[189265]: 2025-09-30 07:29:49.725 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:29:49 compute-0 nova_compute[189265]: 2025-09-30 07:29:49.725 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:29:48 up  1:27,  0 user,  load average: 0.29, 0.21, 0.31\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_migrating': '1', 'num_os_type_None': '1', 'num_proj_6431607f3dce4c88bbf6d17ee6cd45b2': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:29:49 compute-0 nova_compute[189265]: 2025-09-30 07:29:49.801 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing inventories for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 07:29:49 compute-0 nova_compute[189265]: 2025-09-30 07:29:49.884 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating ProviderTree inventory for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 07:29:49 compute-0 nova_compute[189265]: 2025-09-30 07:29:49.885 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating inventory in ProviderTree for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 07:29:49 compute-0 nova_compute[189265]: 2025-09-30 07:29:49.909 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing aggregate associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 07:29:49 compute-0 nova_compute[189265]: 2025-09-30 07:29:49.938 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing trait associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, traits: COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,HW_CPU_X86_F16C,COMPUTE_STATUS_DISABLED,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_FMA3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_AC97,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_ABM,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,HW_CPU_X86_CLMUL _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 07:29:49 compute-0 nova_compute[189265]: 2025-09-30 07:29:49.979 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:29:50 compute-0 nova_compute[189265]: 2025-09-30 07:29:50.487 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:29:51 compute-0 nova_compute[189265]: 2025-09-30 07:29:51.000 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:29:51 compute-0 nova_compute[189265]: 2025-09-30 07:29:51.000 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.422s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:29:51 compute-0 nova_compute[189265]: 2025-09-30 07:29:51.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:51 compute-0 nova_compute[189265]: 2025-09-30 07:29:51.755 2 DEBUG oslo_concurrency.lockutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:29:51 compute-0 nova_compute[189265]: 2025-09-30 07:29:51.756 2 DEBUG oslo_concurrency.lockutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:29:51 compute-0 nova_compute[189265]: 2025-09-30 07:29:51.756 2 DEBUG oslo_concurrency.lockutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c461f91a-e2a7-4222-a940-d8ab09ea4807-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:29:52 compute-0 nova_compute[189265]: 2025-09-30 07:29:52.273 2 DEBUG oslo_concurrency.lockutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:29:52 compute-0 nova_compute[189265]: 2025-09-30 07:29:52.274 2 DEBUG oslo_concurrency.lockutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:29:52 compute-0 nova_compute[189265]: 2025-09-30 07:29:52.274 2 DEBUG oslo_concurrency.lockutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:29:52 compute-0 nova_compute[189265]: 2025-09-30 07:29:52.275 2 DEBUG nova.compute.resource_tracker [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:29:52 compute-0 nova_compute[189265]: 2025-09-30 07:29:52.487 2 WARNING nova.virt.libvirt.driver [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:29:52 compute-0 nova_compute[189265]: 2025-09-30 07:29:52.488 2 DEBUG oslo_concurrency.processutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:29:52 compute-0 nova_compute[189265]: 2025-09-30 07:29:52.517 2 DEBUG oslo_concurrency.processutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LANG=C uptime" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:29:52 compute-0 nova_compute[189265]: 2025-09-30 07:29:52.518 2 DEBUG nova.compute.resource_tracker [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5858MB free_disk=73.30394744873047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:29:52 compute-0 nova_compute[189265]: 2025-09-30 07:29:52.518 2 DEBUG oslo_concurrency.lockutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:29:52 compute-0 nova_compute[189265]: 2025-09-30 07:29:52.519 2 DEBUG oslo_concurrency.lockutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:29:52 compute-0 nova_compute[189265]: 2025-09-30 07:29:52.996 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:29:53 compute-0 nova_compute[189265]: 2025-09-30 07:29:53.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:53 compute-0 nova_compute[189265]: 2025-09-30 07:29:53.504 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:29:53 compute-0 nova_compute[189265]: 2025-09-30 07:29:53.504 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:29:53 compute-0 nova_compute[189265]: 2025-09-30 07:29:53.539 2 DEBUG nova.compute.resource_tracker [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Migration for instance c461f91a-e2a7-4222-a940-d8ab09ea4807 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 07:29:54 compute-0 nova_compute[189265]: 2025-09-30 07:29:54.047 2 DEBUG nova.compute.resource_tracker [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 07:29:54 compute-0 nova_compute[189265]: 2025-09-30 07:29:54.080 2 DEBUG nova.compute.resource_tracker [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Migration c519074c-77cb-4d2d-a9a3-bb6c93d56bdd is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 07:29:54 compute-0 nova_compute[189265]: 2025-09-30 07:29:54.080 2 DEBUG nova.compute.resource_tracker [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:29:54 compute-0 nova_compute[189265]: 2025-09-30 07:29:54.081 2 DEBUG nova.compute.resource_tracker [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:29:52 up  1:27,  0 user,  load average: 0.29, 0.21, 0.31\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:29:54 compute-0 nova_compute[189265]: 2025-09-30 07:29:54.131 2 DEBUG nova.compute.provider_tree [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:29:54 compute-0 nova_compute[189265]: 2025-09-30 07:29:54.639 2 DEBUG nova.scheduler.client.report [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:29:54 compute-0 nova_compute[189265]: 2025-09-30 07:29:54.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:29:55 compute-0 nova_compute[189265]: 2025-09-30 07:29:55.149 2 DEBUG nova.compute.resource_tracker [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:29:55 compute-0 nova_compute[189265]: 2025-09-30 07:29:55.149 2 DEBUG oslo_concurrency.lockutils [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.630s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:29:55 compute-0 nova_compute[189265]: 2025-09-30 07:29:55.170 2 INFO nova.compute.manager [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 07:29:56 compute-0 nova_compute[189265]: 2025-09-30 07:29:56.277 2 INFO nova.scheduler.client.report [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Deleted allocation for migration c519074c-77cb-4d2d-a9a3-bb6c93d56bdd
Sep 30 07:29:56 compute-0 nova_compute[189265]: 2025-09-30 07:29:56.278 2 DEBUG nova.virt.libvirt.driver [None req-1496f62f-5eb1-415b-be58-76c32e09f125 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c461f91a-e2a7-4222-a940-d8ab09ea4807] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 07:29:56 compute-0 nova_compute[189265]: 2025-09-30 07:29:56.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:57 compute-0 podman[218620]: 2025-09-30 07:29:57.497339457 +0000 UTC m=+0.078008991 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:29:58 compute-0 nova_compute[189265]: 2025-09-30 07:29:58.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:29:59 compute-0 podman[199733]: time="2025-09-30T07:29:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:29:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:29:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:29:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:29:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3008 "" "Go-http-client/1.1"
Sep 30 07:30:01 compute-0 openstack_network_exporter[201859]: ERROR   07:30:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:30:01 compute-0 openstack_network_exporter[201859]: ERROR   07:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:30:01 compute-0 openstack_network_exporter[201859]: ERROR   07:30:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:30:01 compute-0 openstack_network_exporter[201859]: ERROR   07:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:30:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:30:01 compute-0 openstack_network_exporter[201859]: ERROR   07:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:30:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:30:01 compute-0 nova_compute[189265]: 2025-09-30 07:30:01.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:03 compute-0 nova_compute[189265]: 2025-09-30 07:30:03.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:06 compute-0 podman[218645]: 2025-09-30 07:30:06.479883335 +0000 UTC m=+0.070253122 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 07:30:06 compute-0 nova_compute[189265]: 2025-09-30 07:30:06.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:08 compute-0 nova_compute[189265]: 2025-09-30 07:30:08.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:09 compute-0 podman[218664]: 2025-09-30 07:30:09.51380296 +0000 UTC m=+0.091623487 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Sep 30 07:30:09 compute-0 nova_compute[189265]: 2025-09-30 07:30:09.713 2 DEBUG nova.compute.manager [None req-5820f55a-3c4b-493b-93ad-68e5f0e5784c bddd62d17bac483fb429dd18b1062646 4049964ce8244dacb50493f6676c6613 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Sep 30 07:30:09 compute-0 nova_compute[189265]: 2025-09-30 07:30:09.757 2 DEBUG nova.compute.provider_tree [None req-5820f55a-3c4b-493b-93ad-68e5f0e5784c bddd62d17bac483fb429dd18b1062646 4049964ce8244dacb50493f6676c6613 - - default default] Updating resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc generation from 20 to 22 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 07:30:11 compute-0 nova_compute[189265]: 2025-09-30 07:30:11.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:13 compute-0 nova_compute[189265]: 2025-09-30 07:30:13.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:13 compute-0 podman[218685]: 2025-09-30 07:30:13.520939292 +0000 UTC m=+0.094592831 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 07:30:13 compute-0 podman[218686]: 2025-09-30 07:30:13.536253266 +0000 UTC m=+0.107490447 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest)
Sep 30 07:30:13 compute-0 podman[218687]: 2025-09-30 07:30:13.576248399 +0000 UTC m=+0.142768146 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Sep 30 07:30:16 compute-0 nova_compute[189265]: 2025-09-30 07:30:16.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:18 compute-0 nova_compute[189265]: 2025-09-30 07:30:18.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:30:20.560 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:30:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:30:20.560 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:30:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:30:20.560 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:30:21 compute-0 nova_compute[189265]: 2025-09-30 07:30:21.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:23 compute-0 nova_compute[189265]: 2025-09-30 07:30:23.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:26 compute-0 nova_compute[189265]: 2025-09-30 07:30:26.733 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:28 compute-0 nova_compute[189265]: 2025-09-30 07:30:28.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:28 compute-0 podman[218748]: 2025-09-30 07:30:28.477291927 +0000 UTC m=+0.061023570 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:30:29 compute-0 podman[199733]: time="2025-09-30T07:30:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:30:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:30:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:30:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:30:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Sep 30 07:30:31 compute-0 openstack_network_exporter[201859]: ERROR   07:30:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:30:31 compute-0 openstack_network_exporter[201859]: ERROR   07:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:30:31 compute-0 openstack_network_exporter[201859]: ERROR   07:30:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:30:31 compute-0 openstack_network_exporter[201859]: ERROR   07:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:30:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:30:31 compute-0 openstack_network_exporter[201859]: ERROR   07:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:30:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:30:31 compute-0 nova_compute[189265]: 2025-09-30 07:30:31.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:33 compute-0 nova_compute[189265]: 2025-09-30 07:30:33.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:36 compute-0 nova_compute[189265]: 2025-09-30 07:30:36.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:37 compute-0 podman[218772]: 2025-09-30 07:30:37.499435799 +0000 UTC m=+0.083344942 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Sep 30 07:30:38 compute-0 nova_compute[189265]: 2025-09-30 07:30:38.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:30:38.810 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:30:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:30:38.811 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:30:38 compute-0 nova_compute[189265]: 2025-09-30 07:30:38.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:40 compute-0 podman[218793]: 2025-09-30 07:30:40.492519418 +0000 UTC m=+0.080466091 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter)
Sep 30 07:30:41 compute-0 nova_compute[189265]: 2025-09-30 07:30:41.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:41 compute-0 nova_compute[189265]: 2025-09-30 07:30:41.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:30:42 compute-0 nova_compute[189265]: 2025-09-30 07:30:42.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:30:43 compute-0 nova_compute[189265]: 2025-09-30 07:30:43.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:44 compute-0 podman[218815]: 2025-09-30 07:30:44.499835335 +0000 UTC m=+0.064770616 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Sep 30 07:30:44 compute-0 podman[218814]: 2025-09-30 07:30:44.501405599 +0000 UTC m=+0.082169899 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 07:30:44 compute-0 podman[218821]: 2025-09-30 07:30:44.535420763 +0000 UTC m=+0.102587748 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Sep 30 07:30:44 compute-0 nova_compute[189265]: 2025-09-30 07:30:44.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:30:44 compute-0 nova_compute[189265]: 2025-09-30 07:30:44.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:30:44 compute-0 nova_compute[189265]: 2025-09-30 07:30:44.787 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:30:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:30:44.812 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:30:46 compute-0 nova_compute[189265]: 2025-09-30 07:30:46.652 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:30:46 compute-0 nova_compute[189265]: 2025-09-30 07:30:46.653 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:30:46 compute-0 nova_compute[189265]: 2025-09-30 07:30:46.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:47 compute-0 nova_compute[189265]: 2025-09-30 07:30:47.160 2 DEBUG nova.compute.manager [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 07:30:47 compute-0 nova_compute[189265]: 2025-09-30 07:30:47.748 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:30:47 compute-0 nova_compute[189265]: 2025-09-30 07:30:47.748 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:30:47 compute-0 nova_compute[189265]: 2025-09-30 07:30:47.756 2 DEBUG nova.virt.hardware [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 07:30:47 compute-0 nova_compute[189265]: 2025-09-30 07:30:47.757 2 INFO nova.compute.claims [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Claim successful on node compute-0.ctlplane.example.com
Sep 30 07:30:48 compute-0 nova_compute[189265]: 2025-09-30 07:30:48.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:48 compute-0 nova_compute[189265]: 2025-09-30 07:30:48.828 2 DEBUG nova.compute.provider_tree [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:30:49 compute-0 nova_compute[189265]: 2025-09-30 07:30:49.337 2 DEBUG nova.scheduler.client.report [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:30:49 compute-0 nova_compute[189265]: 2025-09-30 07:30:49.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:30:49 compute-0 nova_compute[189265]: 2025-09-30 07:30:49.849 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.101s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:30:49 compute-0 nova_compute[189265]: 2025-09-30 07:30:49.850 2 DEBUG nova.compute.manager [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 07:30:50 compute-0 nova_compute[189265]: 2025-09-30 07:30:50.301 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:30:50 compute-0 nova_compute[189265]: 2025-09-30 07:30:50.302 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:30:50 compute-0 nova_compute[189265]: 2025-09-30 07:30:50.302 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:30:50 compute-0 nova_compute[189265]: 2025-09-30 07:30:50.302 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:30:50 compute-0 nova_compute[189265]: 2025-09-30 07:30:50.364 2 DEBUG nova.compute.manager [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 07:30:50 compute-0 nova_compute[189265]: 2025-09-30 07:30:50.365 2 DEBUG nova.network.neutron [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 07:30:50 compute-0 nova_compute[189265]: 2025-09-30 07:30:50.365 2 WARNING neutronclient.v2_0.client [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:30:50 compute-0 nova_compute[189265]: 2025-09-30 07:30:50.366 2 WARNING neutronclient.v2_0.client [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:30:50 compute-0 nova_compute[189265]: 2025-09-30 07:30:50.482 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:30:50 compute-0 nova_compute[189265]: 2025-09-30 07:30:50.483 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:30:50 compute-0 nova_compute[189265]: 2025-09-30 07:30:50.502 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:30:50 compute-0 nova_compute[189265]: 2025-09-30 07:30:50.503 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5861MB free_disk=73.30394744873047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:30:50 compute-0 nova_compute[189265]: 2025-09-30 07:30:50.503 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:30:50 compute-0 nova_compute[189265]: 2025-09-30 07:30:50.504 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:30:50 compute-0 nova_compute[189265]: 2025-09-30 07:30:50.874 2 INFO nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 07:30:51 compute-0 nova_compute[189265]: 2025-09-30 07:30:51.124 2 DEBUG nova.network.neutron [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Successfully created port: 5406a6ea-407c-46b1-b791-a508e191918e _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 07:30:51 compute-0 nova_compute[189265]: 2025-09-30 07:30:51.385 2 DEBUG nova.compute.manager [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 07:30:51 compute-0 nova_compute[189265]: 2025-09-30 07:30:51.545 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance a6ffd09b-ce40-4418-87d0-5555a8f04f67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:30:51 compute-0 nova_compute[189265]: 2025-09-30 07:30:51.545 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:30:51 compute-0 nova_compute[189265]: 2025-09-30 07:30:51.545 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:30:50 up  1:28,  0 user,  load average: 0.10, 0.17, 0.28\n', 'num_instances': '1', 'num_vm_building': '1', 'num_task_networking': '1', 'num_os_type_None': '1', 'num_proj_6431607f3dce4c88bbf6d17ee6cd45b2': '1', 'io_workload': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:30:51 compute-0 nova_compute[189265]: 2025-09-30 07:30:51.584 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:30:51 compute-0 nova_compute[189265]: 2025-09-30 07:30:51.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:51 compute-0 nova_compute[189265]: 2025-09-30 07:30:51.986 2 DEBUG nova.network.neutron [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Successfully updated port: 5406a6ea-407c-46b1-b791-a508e191918e _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.060 2 DEBUG nova.compute.manager [req-2b921a79-59e3-49f5-a278-65a0db8ca8af req-46d94ebe-520c-4d15-b7e5-1e5eba6f903d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Received event network-changed-5406a6ea-407c-46b1-b791-a508e191918e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.061 2 DEBUG nova.compute.manager [req-2b921a79-59e3-49f5-a278-65a0db8ca8af req-46d94ebe-520c-4d15-b7e5-1e5eba6f903d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Refreshing instance network info cache due to event network-changed-5406a6ea-407c-46b1-b791-a508e191918e. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.061 2 DEBUG oslo_concurrency.lockutils [req-2b921a79-59e3-49f5-a278-65a0db8ca8af req-46d94ebe-520c-4d15-b7e5-1e5eba6f903d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-a6ffd09b-ce40-4418-87d0-5555a8f04f67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.061 2 DEBUG oslo_concurrency.lockutils [req-2b921a79-59e3-49f5-a278-65a0db8ca8af req-46d94ebe-520c-4d15-b7e5-1e5eba6f903d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-a6ffd09b-ce40-4418-87d0-5555a8f04f67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.062 2 DEBUG nova.network.neutron [req-2b921a79-59e3-49f5-a278-65a0db8ca8af req-46d94ebe-520c-4d15-b7e5-1e5eba6f903d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Refreshing network info cache for port 5406a6ea-407c-46b1-b791-a508e191918e _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.129 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.406 2 DEBUG nova.compute.manager [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.407 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.408 2 INFO nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Creating image(s)
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.409 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "/var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.409 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "/var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.410 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "/var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.411 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.417 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.419 2 DEBUG oslo_concurrency.processutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.495 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "refresh_cache-a6ffd09b-ce40-4418-87d0-5555a8f04f67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.510 2 DEBUG oslo_concurrency.processutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.510 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.511 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.512 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.518 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.519 2 DEBUG oslo_concurrency.processutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.575 2 WARNING neutronclient.v2_0.client [req-2b921a79-59e3-49f5-a278-65a0db8ca8af req-46d94ebe-520c-4d15-b7e5-1e5eba6f903d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.584 2 DEBUG oslo_concurrency.processutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.586 2 DEBUG oslo_concurrency.processutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.631 2 DEBUG oslo_concurrency.processutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.632 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.633 2 DEBUG oslo_concurrency.processutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.643 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.644 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.140s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.683 2 DEBUG oslo_concurrency.processutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.684 2 DEBUG nova.virt.disk.api [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Checking if we can resize image /var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.684 2 DEBUG oslo_concurrency.processutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.752 2 DEBUG oslo_concurrency.processutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.752 2 DEBUG nova.virt.disk.api [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Cannot resize image /var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.753 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.753 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Ensure instance console log exists: /var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.753 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.753 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:30:52 compute-0 nova_compute[189265]: 2025-09-30 07:30:52.754 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:30:53 compute-0 nova_compute[189265]: 2025-09-30 07:30:53.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:53 compute-0 nova_compute[189265]: 2025-09-30 07:30:53.645 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:30:53 compute-0 nova_compute[189265]: 2025-09-30 07:30:53.646 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:30:53 compute-0 nova_compute[189265]: 2025-09-30 07:30:53.700 2 DEBUG nova.network.neutron [req-2b921a79-59e3-49f5-a278-65a0db8ca8af req-46d94ebe-520c-4d15-b7e5-1e5eba6f903d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:30:54 compute-0 nova_compute[189265]: 2025-09-30 07:30:54.338 2 DEBUG nova.network.neutron [req-2b921a79-59e3-49f5-a278-65a0db8ca8af req-46d94ebe-520c-4d15-b7e5-1e5eba6f903d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:30:54 compute-0 nova_compute[189265]: 2025-09-30 07:30:54.845 2 DEBUG oslo_concurrency.lockutils [req-2b921a79-59e3-49f5-a278-65a0db8ca8af req-46d94ebe-520c-4d15-b7e5-1e5eba6f903d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-a6ffd09b-ce40-4418-87d0-5555a8f04f67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:30:54 compute-0 nova_compute[189265]: 2025-09-30 07:30:54.845 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquired lock "refresh_cache-a6ffd09b-ce40-4418-87d0-5555a8f04f67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:30:54 compute-0 nova_compute[189265]: 2025-09-30 07:30:54.846 2 DEBUG nova.network.neutron [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:30:55 compute-0 nova_compute[189265]: 2025-09-30 07:30:55.716 2 DEBUG nova.network.neutron [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:30:55 compute-0 nova_compute[189265]: 2025-09-30 07:30:55.982 2 WARNING neutronclient.v2_0.client [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.187 2 DEBUG nova.network.neutron [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Updating instance_info_cache with network_info: [{"id": "5406a6ea-407c-46b1-b791-a508e191918e", "address": "fa:16:3e:0f:9f:77", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5406a6ea-40", "ovs_interfaceid": "5406a6ea-407c-46b1-b791-a508e191918e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.697 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Releasing lock "refresh_cache-a6ffd09b-ce40-4418-87d0-5555a8f04f67" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.697 2 DEBUG nova.compute.manager [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Instance network_info: |[{"id": "5406a6ea-407c-46b1-b791-a508e191918e", "address": "fa:16:3e:0f:9f:77", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5406a6ea-40", "ovs_interfaceid": "5406a6ea-407c-46b1-b791-a508e191918e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.699 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Start _get_guest_xml network_info=[{"id": "5406a6ea-407c-46b1-b791-a508e191918e", "address": "fa:16:3e:0f:9f:77", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5406a6ea-40", "ovs_interfaceid": "5406a6ea-407c-46b1-b791-a508e191918e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '0c6b92f5-9861-49e4-862d-3ffd84520dfa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.702 2 WARNING nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.703 2 DEBUG nova.virt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-1986949610', uuid='a6ffd09b-ce40-4418-87d0-5555a8f04f67'), owner=OwnerMeta(userid='89ba5d19014145188ad2a3c812acdc88', username='tempest-TestExecuteStrategies-1096120513-project-admin', projectid='6431607f3dce4c88bbf6d17ee6cd45b2', projectname='tempest-TestExecuteStrategies-1096120513'), image=ImageMeta(id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "5406a6ea-407c-46b1-b791-a508e191918e", "address": "fa:16:3e:0f:9f:77", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5406a6ea-40", "ovs_interfaceid": "5406a6ea-407c-46b1-b791-a508e191918e", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759217456.702965) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.707 2 DEBUG nova.virt.libvirt.host [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.707 2 DEBUG nova.virt.libvirt.host [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.710 2 DEBUG nova.virt.libvirt.host [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.710 2 DEBUG nova.virt.libvirt.host [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.711 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.711 2 DEBUG nova.virt.hardware [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T07:07:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.711 2 DEBUG nova.virt.hardware [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.711 2 DEBUG nova.virt.hardware [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.711 2 DEBUG nova.virt.hardware [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.712 2 DEBUG nova.virt.hardware [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.712 2 DEBUG nova.virt.hardware [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.712 2 DEBUG nova.virt.hardware [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.712 2 DEBUG nova.virt.hardware [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.712 2 DEBUG nova.virt.hardware [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.713 2 DEBUG nova.virt.hardware [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.713 2 DEBUG nova.virt.hardware [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.715 2 DEBUG nova.virt.libvirt.vif [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:30:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1986949610',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1986949610',id=17,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-6sh2c7j8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admi
n'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:30:51Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=a6ffd09b-ce40-4418-87d0-5555a8f04f67,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5406a6ea-407c-46b1-b791-a508e191918e", "address": "fa:16:3e:0f:9f:77", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5406a6ea-40", "ovs_interfaceid": "5406a6ea-407c-46b1-b791-a508e191918e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.715 2 DEBUG nova.network.os_vif_util [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converting VIF {"id": "5406a6ea-407c-46b1-b791-a508e191918e", "address": "fa:16:3e:0f:9f:77", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5406a6ea-40", "ovs_interfaceid": "5406a6ea-407c-46b1-b791-a508e191918e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.716 2 DEBUG nova.network.os_vif_util [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:9f:77,bridge_name='br-int',has_traffic_filtering=True,id=5406a6ea-407c-46b1-b791-a508e191918e,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5406a6ea-40') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.717 2 DEBUG nova.objects.instance [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lazy-loading 'pci_devices' on Instance uuid a6ffd09b-ce40-4418-87d0-5555a8f04f67 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:30:56 compute-0 nova_compute[189265]: 2025-09-30 07:30:56.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.224 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] End _get_guest_xml xml=<domain type="kvm">
Sep 30 07:30:57 compute-0 nova_compute[189265]:   <uuid>a6ffd09b-ce40-4418-87d0-5555a8f04f67</uuid>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   <name>instance-00000011</name>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   <memory>131072</memory>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   <vcpu>1</vcpu>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteStrategies-server-1986949610</nova:name>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:30:56</nova:creationTime>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:30:57 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:30:57 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:30:57 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:30:57 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:30:57 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:30:57 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:30:57 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:30:57 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:30:57 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:30:57 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:30:57 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:30:57 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:30:57 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:30:57 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:30:57 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:30:57 compute-0 nova_compute[189265]:         <nova:user uuid="89ba5d19014145188ad2a3c812acdc88">tempest-TestExecuteStrategies-1096120513-project-admin</nova:user>
Sep 30 07:30:57 compute-0 nova_compute[189265]:         <nova:project uuid="6431607f3dce4c88bbf6d17ee6cd45b2">tempest-TestExecuteStrategies-1096120513</nova:project>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:30:57 compute-0 nova_compute[189265]:         <nova:port uuid="5406a6ea-407c-46b1-b791-a508e191918e">
Sep 30 07:30:57 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <system>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <entry name="serial">a6ffd09b-ce40-4418-87d0-5555a8f04f67</entry>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <entry name="uuid">a6ffd09b-ce40-4418-87d0-5555a8f04f67</entry>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     </system>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   <os>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   </os>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   <features>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <vmcoreinfo/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   </features>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   <cpu mode="host-model" match="exact">
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67/disk"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67/disk.config"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:0f:9f:77"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <target dev="tap5406a6ea-40"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67/console.log" append="off"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <video>
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     </video>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <controller type="usb" index="0"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:30:57 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:30:57 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:30:57 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:30:57 compute-0 nova_compute[189265]: </domain>
Sep 30 07:30:57 compute-0 nova_compute[189265]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.226 2 DEBUG nova.compute.manager [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Preparing to wait for external event network-vif-plugged-5406a6ea-407c-46b1-b791-a508e191918e prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.226 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.226 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.226 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.227 2 DEBUG nova.virt.libvirt.vif [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:30:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1986949610',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1986949610',id=17,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-6sh2c7j8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-pr
oject-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:30:51Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=a6ffd09b-ce40-4418-87d0-5555a8f04f67,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5406a6ea-407c-46b1-b791-a508e191918e", "address": "fa:16:3e:0f:9f:77", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5406a6ea-40", "ovs_interfaceid": "5406a6ea-407c-46b1-b791-a508e191918e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.227 2 DEBUG nova.network.os_vif_util [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converting VIF {"id": "5406a6ea-407c-46b1-b791-a508e191918e", "address": "fa:16:3e:0f:9f:77", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5406a6ea-40", "ovs_interfaceid": "5406a6ea-407c-46b1-b791-a508e191918e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.227 2 DEBUG nova.network.os_vif_util [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:9f:77,bridge_name='br-int',has_traffic_filtering=True,id=5406a6ea-407c-46b1-b791-a508e191918e,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5406a6ea-40') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.228 2 DEBUG os_vif [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:9f:77,bridge_name='br-int',has_traffic_filtering=True,id=5406a6ea-407c-46b1-b791-a508e191918e,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5406a6ea-40') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.228 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.229 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.229 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9d2182f9-6f36-5ac9-af1b-b52dd9d872c1', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.268 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5406a6ea-40, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.268 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap5406a6ea-40, col_values=(('qos', UUID('087ad3a2-0c35-4ece-8dde-3b866f861316')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.268 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap5406a6ea-40, col_values=(('external_ids', {'iface-id': '5406a6ea-407c-46b1-b791-a508e191918e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0f:9f:77', 'vm-uuid': 'a6ffd09b-ce40-4418-87d0-5555a8f04f67'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:57 compute-0 NetworkManager[51813]: <info>  [1759217457.2707] manager: (tap5406a6ea-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:57 compute-0 nova_compute[189265]: 2025-09-30 07:30:57.275 2 INFO os_vif [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:9f:77,bridge_name='br-int',has_traffic_filtering=True,id=5406a6ea-407c-46b1-b791-a508e191918e,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5406a6ea-40')
Sep 30 07:30:58 compute-0 nova_compute[189265]: 2025-09-30 07:30:58.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:30:58 compute-0 nova_compute[189265]: 2025-09-30 07:30:58.815 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:30:58 compute-0 nova_compute[189265]: 2025-09-30 07:30:58.815 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:30:58 compute-0 nova_compute[189265]: 2025-09-30 07:30:58.816 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] No VIF found with MAC fa:16:3e:0f:9f:77, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 07:30:58 compute-0 nova_compute[189265]: 2025-09-30 07:30:58.817 2 INFO nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Using config drive
Sep 30 07:30:59 compute-0 nova_compute[189265]: 2025-09-30 07:30:59.336 2 WARNING neutronclient.v2_0.client [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:30:59 compute-0 podman[218896]: 2025-09-30 07:30:59.505653583 +0000 UTC m=+0.087442809 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:30:59 compute-0 podman[199733]: time="2025-09-30T07:30:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:30:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:30:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:30:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:30:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3008 "" "Go-http-client/1.1"
Sep 30 07:30:59 compute-0 nova_compute[189265]: 2025-09-30 07:30:59.784 2 INFO nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Creating config drive at /var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67/disk.config
Sep 30 07:30:59 compute-0 nova_compute[189265]: 2025-09-30 07:30:59.794 2 DEBUG oslo_concurrency.processutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp4hqv2_zn execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:30:59 compute-0 nova_compute[189265]: 2025-09-30 07:30:59.939 2 DEBUG oslo_concurrency.processutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmp4hqv2_zn" returned: 0 in 0.144s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:31:00 compute-0 kernel: tap5406a6ea-40: entered promiscuous mode
Sep 30 07:31:00 compute-0 NetworkManager[51813]: <info>  [1759217460.0414] manager: (tap5406a6ea-40): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Sep 30 07:31:00 compute-0 ovn_controller[91436]: 2025-09-30T07:31:00Z|00160|binding|INFO|Claiming lport 5406a6ea-407c-46b1-b791-a508e191918e for this chassis.
Sep 30 07:31:00 compute-0 ovn_controller[91436]: 2025-09-30T07:31:00Z|00161|binding|INFO|5406a6ea-407c-46b1-b791-a508e191918e: Claiming fa:16:3e:0f:9f:77 10.100.0.8
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.056 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:9f:77 10.100.0.8'], port_security=['fa:16:3e:0f:9f:77 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a6ffd09b-ce40-4418-87d0-5555a8f04f67', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=5406a6ea-407c-46b1-b791-a508e191918e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.057 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 5406a6ea-407c-46b1-b791-a508e191918e in datapath c99c822b-3191-49e5-b938-903e25b4a9bb bound to our chassis
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.059 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:31:00 compute-0 ovn_controller[91436]: 2025-09-30T07:31:00Z|00162|binding|INFO|Setting lport 5406a6ea-407c-46b1-b791-a508e191918e ovn-installed in OVS
Sep 30 07:31:00 compute-0 ovn_controller[91436]: 2025-09-30T07:31:00Z|00163|binding|INFO|Setting lport 5406a6ea-407c-46b1-b791-a508e191918e up in Southbound
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:00 compute-0 systemd-udevd[218938]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.081 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[2ebd87e0-5334-4d23-b58e-e78cfd7c6bd5]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.082 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc99c822b-31 in ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.083 210650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc99c822b-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.083 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[3f8f60a4-d9be-4165-a073-4b79e482b24f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.085 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[ec4e1116-2c69-414e-bcd8-a1fc447c6ceb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 NetworkManager[51813]: <info>  [1759217460.0963] device (tap5406a6ea-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:31:00 compute-0 NetworkManager[51813]: <info>  [1759217460.0978] device (tap5406a6ea-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.101 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[83ff8525-003b-446f-84fe-02f2a1ed5361]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 systemd-machined[149233]: New machine qemu-12-instance-00000011.
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.117 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[5bb6d08e-e951-4d56-ac13-4b9bcbe3c370]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 systemd[1]: Started Virtual Machine qemu-12-instance-00000011.
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.159 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b7cab2-d4a3-432d-933d-6a1e04ff9c97]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.164 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[fcd5a401-d188-45a8-9cda-6de99121b11e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 NetworkManager[51813]: <info>  [1759217460.1656] manager: (tapc99c822b-30): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Sep 30 07:31:00 compute-0 systemd-udevd[218944]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.209 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[1fa54351-cc2f-4629-b81b-f1db54e02f88]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.212 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[5c131c7e-61fc-4ba9-9ef6-0da64ca77743]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 NetworkManager[51813]: <info>  [1759217460.2361] device (tapc99c822b-30): carrier: link connected
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.242 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[ba7d0ed1-44a1-463a-b5e0-97f1d4d92491]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.259 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[544a0694-bce6-475a-98ca-b81e46e113d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 531795, 'reachable_time': 33158, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218973, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.272 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[4b7a954f-0073-44ef-b865-bf1520a26d94]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:678c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 531795, 'tstamp': 531795}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218974, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.287 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[5331918f-288a-4bc3-8963-da775b895b4f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 531795, 'reachable_time': 33158, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218975, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.315 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[8bcedc7f-40cc-4dd9-8d56-b2879ff1060f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.378 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d5cdd1f1-1e3c-4ee7-8033-ee0aa24995a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.379 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.379 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.380 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc99c822b-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:00 compute-0 NetworkManager[51813]: <info>  [1759217460.3819] manager: (tapc99c822b-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:00 compute-0 kernel: tapc99c822b-30: entered promiscuous mode
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.385 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc99c822b-30, col_values=(('external_ids', {'iface-id': '67b7df48-3f38-444a-8506-1c0ec5bd1d15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:00 compute-0 ovn_controller[91436]: 2025-09-30T07:31:00Z|00164|binding|INFO|Releasing lport 67b7df48-3f38-444a-8506-1c0ec5bd1d15 from this chassis (sb_readonly=0)
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.389 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[13f945cc-b4c8-4fe0-969c-9649f0cf8d4d]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.389 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.389 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.390 100322 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for c99c822b-3191-49e5-b938-903e25b4a9bb disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.390 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.390 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[6a6749e9-97c2-4a80-973d-d7ffb72401e8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.390 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.391 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[834c3c5b-ea67-4e7b-9685-4634bbcf000f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.391 100322 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: global
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     log         /dev/log local0 debug
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     log-tag     haproxy-metadata-proxy-c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     user        root
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     group       root
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     maxconn     1024
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     pidfile     /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     daemon
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: defaults
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     log global
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     mode http
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     option httplog
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     option dontlognull
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     option http-server-close
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     option forwardfor
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     retries                 3
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     timeout http-request    30s
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     timeout connect         30s
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     timeout client          32s
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     timeout server          32s
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     timeout http-keep-alive 30s
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: listen listener
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     bind 169.254.169.254:80
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:     http-request add-header X-OVN-Network-ID c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 07:31:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:00.392 100322 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'env', 'PROCESS_TAG=haproxy-c99c822b-3191-49e5-b938-903e25b4a9bb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c99c822b-3191-49e5-b938-903e25b4a9bb.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:00 compute-0 podman[219011]: 2025-09-30 07:31:00.7562943 +0000 UTC m=+0.049883954 container create 7ba814fe689129cd63077f59b73aba0bf7f44e7c874b8e5102721e0e554267f8 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Sep 30 07:31:00 compute-0 systemd[1]: Started libpod-conmon-7ba814fe689129cd63077f59b73aba0bf7f44e7c874b8e5102721e0e554267f8.scope.
Sep 30 07:31:00 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:31:00 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5779053d06151f4b9c6748a2ecda1c9ff8c0d351cb69fb72302c73565796c6ef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 07:31:00 compute-0 podman[219011]: 2025-09-30 07:31:00.728540084 +0000 UTC m=+0.022129758 image pull eeebcc09bc72f81ab45f5ab87eb8f6a7b554b949227aeec082bdb0732754ddc8 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 07:31:00 compute-0 podman[219011]: 2025-09-30 07:31:00.839358964 +0000 UTC m=+0.132948628 container init 7ba814fe689129cd63077f59b73aba0bf7f44e7c874b8e5102721e0e554267f8 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:31:00 compute-0 podman[219011]: 2025-09-30 07:31:00.845862358 +0000 UTC m=+0.139452002 container start 7ba814fe689129cd63077f59b73aba0bf7f44e7c874b8e5102721e0e554267f8 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0)
Sep 30 07:31:00 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[219026]: [NOTICE]   (219031) : New worker (219033) forked
Sep 30 07:31:00 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[219026]: [NOTICE]   (219031) : Loading success.
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.899 2 DEBUG nova.compute.manager [req-ff04f794-132e-4310-8088-6a2e44ccd3b1 req-e6f3d200-9600-4a5f-ace4-121fee5db713 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Received event network-vif-plugged-5406a6ea-407c-46b1-b791-a508e191918e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.899 2 DEBUG oslo_concurrency.lockutils [req-ff04f794-132e-4310-8088-6a2e44ccd3b1 req-e6f3d200-9600-4a5f-ace4-121fee5db713 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.900 2 DEBUG oslo_concurrency.lockutils [req-ff04f794-132e-4310-8088-6a2e44ccd3b1 req-e6f3d200-9600-4a5f-ace4-121fee5db713 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.900 2 DEBUG oslo_concurrency.lockutils [req-ff04f794-132e-4310-8088-6a2e44ccd3b1 req-e6f3d200-9600-4a5f-ace4-121fee5db713 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.900 2 DEBUG nova.compute.manager [req-ff04f794-132e-4310-8088-6a2e44ccd3b1 req-e6f3d200-9600-4a5f-ace4-121fee5db713 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Processing event network-vif-plugged-5406a6ea-407c-46b1-b791-a508e191918e _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.901 2 DEBUG nova.compute.manager [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.904 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.906 2 INFO nova.virt.libvirt.driver [-] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Instance spawned successfully.
Sep 30 07:31:00 compute-0 nova_compute[189265]: 2025-09-30 07:31:00.906 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 07:31:01 compute-0 openstack_network_exporter[201859]: ERROR   07:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:31:01 compute-0 openstack_network_exporter[201859]: ERROR   07:31:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:31:01 compute-0 openstack_network_exporter[201859]: ERROR   07:31:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:31:01 compute-0 openstack_network_exporter[201859]: ERROR   07:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:31:01 compute-0 openstack_network_exporter[201859]: ERROR   07:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:31:01 compute-0 nova_compute[189265]: 2025-09-30 07:31:01.418 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:31:01 compute-0 nova_compute[189265]: 2025-09-30 07:31:01.419 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:31:01 compute-0 nova_compute[189265]: 2025-09-30 07:31:01.419 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:31:01 compute-0 nova_compute[189265]: 2025-09-30 07:31:01.420 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:31:01 compute-0 nova_compute[189265]: 2025-09-30 07:31:01.420 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:31:01 compute-0 nova_compute[189265]: 2025-09-30 07:31:01.421 2 DEBUG nova.virt.libvirt.driver [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:31:01 compute-0 nova_compute[189265]: 2025-09-30 07:31:01.931 2 INFO nova.compute.manager [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Took 9.52 seconds to spawn the instance on the hypervisor.
Sep 30 07:31:01 compute-0 nova_compute[189265]: 2025-09-30 07:31:01.932 2 DEBUG nova.compute.manager [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:31:02 compute-0 nova_compute[189265]: 2025-09-30 07:31:02.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:02 compute-0 nova_compute[189265]: 2025-09-30 07:31:02.468 2 INFO nova.compute.manager [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Took 14.77 seconds to build instance.
Sep 30 07:31:02 compute-0 nova_compute[189265]: 2025-09-30 07:31:02.961 2 DEBUG nova.compute.manager [req-229e977b-e97d-4276-b5d3-3a7a6bdee2c7 req-f83eb793-2010-40de-b319-eaebd246893e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Received event network-vif-plugged-5406a6ea-407c-46b1-b791-a508e191918e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:31:02 compute-0 nova_compute[189265]: 2025-09-30 07:31:02.962 2 DEBUG oslo_concurrency.lockutils [req-229e977b-e97d-4276-b5d3-3a7a6bdee2c7 req-f83eb793-2010-40de-b319-eaebd246893e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:31:02 compute-0 nova_compute[189265]: 2025-09-30 07:31:02.962 2 DEBUG oslo_concurrency.lockutils [req-229e977b-e97d-4276-b5d3-3a7a6bdee2c7 req-f83eb793-2010-40de-b319-eaebd246893e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:31:02 compute-0 nova_compute[189265]: 2025-09-30 07:31:02.963 2 DEBUG oslo_concurrency.lockutils [req-229e977b-e97d-4276-b5d3-3a7a6bdee2c7 req-f83eb793-2010-40de-b319-eaebd246893e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:31:02 compute-0 nova_compute[189265]: 2025-09-30 07:31:02.963 2 DEBUG nova.compute.manager [req-229e977b-e97d-4276-b5d3-3a7a6bdee2c7 req-f83eb793-2010-40de-b319-eaebd246893e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] No waiting events found dispatching network-vif-plugged-5406a6ea-407c-46b1-b791-a508e191918e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:31:02 compute-0 nova_compute[189265]: 2025-09-30 07:31:02.963 2 WARNING nova.compute.manager [req-229e977b-e97d-4276-b5d3-3a7a6bdee2c7 req-f83eb793-2010-40de-b319-eaebd246893e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Received unexpected event network-vif-plugged-5406a6ea-407c-46b1-b791-a508e191918e for instance with vm_state active and task_state None.
Sep 30 07:31:02 compute-0 nova_compute[189265]: 2025-09-30 07:31:02.973 2 DEBUG oslo_concurrency.lockutils [None req-b7aff384-05f8-48bf-902b-97300ae43d99 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.321s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:31:03 compute-0 nova_compute[189265]: 2025-09-30 07:31:03.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:07 compute-0 nova_compute[189265]: 2025-09-30 07:31:07.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:08 compute-0 nova_compute[189265]: 2025-09-30 07:31:08.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:08 compute-0 podman[219042]: 2025-09-30 07:31:08.490698402 +0000 UTC m=+0.067428061 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 07:31:11 compute-0 podman[219082]: 2025-09-30 07:31:11.485174731 +0000 UTC m=+0.076307483 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, maintainer=Red Hat, Inc., release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, name=ubi9-minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible)
Sep 30 07:31:12 compute-0 nova_compute[189265]: 2025-09-30 07:31:12.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:13 compute-0 ovn_controller[91436]: 2025-09-30T07:31:13Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0f:9f:77 10.100.0.8
Sep 30 07:31:13 compute-0 ovn_controller[91436]: 2025-09-30T07:31:13Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0f:9f:77 10.100.0.8
Sep 30 07:31:13 compute-0 nova_compute[189265]: 2025-09-30 07:31:13.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:15 compute-0 podman[219103]: 2025-09-30 07:31:15.487537737 +0000 UTC m=+0.059876596 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd)
Sep 30 07:31:15 compute-0 podman[219104]: 2025-09-30 07:31:15.487598499 +0000 UTC m=+0.056476651 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Sep 30 07:31:15 compute-0 podman[219105]: 2025-09-30 07:31:15.520355127 +0000 UTC m=+0.086355128 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Sep 30 07:31:15 compute-0 nova_compute[189265]: 2025-09-30 07:31:15.839 2 DEBUG nova.virt.libvirt.driver [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Creating tmpfile /var/lib/nova/instances/tmpk2r093zv to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 07:31:15 compute-0 nova_compute[189265]: 2025-09-30 07:31:15.840 2 WARNING neutronclient.v2_0.client [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:15 compute-0 nova_compute[189265]: 2025-09-30 07:31:15.922 2 DEBUG nova.compute.manager [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpk2r093zv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 07:31:17 compute-0 nova_compute[189265]: 2025-09-30 07:31:17.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:17 compute-0 nova_compute[189265]: 2025-09-30 07:31:17.961 2 WARNING neutronclient.v2_0.client [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:18 compute-0 nova_compute[189265]: 2025-09-30 07:31:18.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:20.562 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:31:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:20.562 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:31:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:20.563 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:31:22 compute-0 nova_compute[189265]: 2025-09-30 07:31:22.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:23 compute-0 nova_compute[189265]: 2025-09-30 07:31:23.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:23 compute-0 nova_compute[189265]: 2025-09-30 07:31:23.548 2 DEBUG nova.compute.manager [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpk2r093zv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8b5264fb-4374-456c-aa18-276d431aa425',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 07:31:24 compute-0 nova_compute[189265]: 2025-09-30 07:31:24.567 2 DEBUG oslo_concurrency.lockutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-8b5264fb-4374-456c-aa18-276d431aa425" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:31:24 compute-0 nova_compute[189265]: 2025-09-30 07:31:24.568 2 DEBUG oslo_concurrency.lockutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-8b5264fb-4374-456c-aa18-276d431aa425" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:31:24 compute-0 nova_compute[189265]: 2025-09-30 07:31:24.568 2 DEBUG nova.network.neutron [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:31:25 compute-0 nova_compute[189265]: 2025-09-30 07:31:25.076 2 WARNING neutronclient.v2_0.client [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:25 compute-0 nova_compute[189265]: 2025-09-30 07:31:25.590 2 WARNING neutronclient.v2_0.client [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:25 compute-0 nova_compute[189265]: 2025-09-30 07:31:25.830 2 DEBUG nova.network.neutron [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Updating instance_info_cache with network_info: [{"id": "f3eac6a4-578b-4544-b899-b34007452c34", "address": "fa:16:3e:2e:65:44", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3eac6a4-57", "ovs_interfaceid": "f3eac6a4-578b-4544-b899-b34007452c34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:31:26 compute-0 nova_compute[189265]: 2025-09-30 07:31:26.348 2 DEBUG oslo_concurrency.lockutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-8b5264fb-4374-456c-aa18-276d431aa425" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:31:26 compute-0 nova_compute[189265]: 2025-09-30 07:31:26.377 2 DEBUG nova.virt.libvirt.driver [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpk2r093zv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8b5264fb-4374-456c-aa18-276d431aa425',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 07:31:26 compute-0 nova_compute[189265]: 2025-09-30 07:31:26.378 2 DEBUG nova.virt.libvirt.driver [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Creating instance directory: /var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 07:31:26 compute-0 nova_compute[189265]: 2025-09-30 07:31:26.378 2 DEBUG nova.virt.libvirt.driver [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Creating disk.info with the contents: {'/var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425/disk': 'qcow2', '/var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 07:31:26 compute-0 nova_compute[189265]: 2025-09-30 07:31:26.379 2 DEBUG nova.virt.libvirt.driver [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 07:31:26 compute-0 nova_compute[189265]: 2025-09-30 07:31:26.380 2 DEBUG nova.objects.instance [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid 8b5264fb-4374-456c-aa18-276d431aa425 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:31:26 compute-0 nova_compute[189265]: 2025-09-30 07:31:26.890 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:31:26 compute-0 nova_compute[189265]: 2025-09-30 07:31:26.894 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:31:26 compute-0 nova_compute[189265]: 2025-09-30 07:31:26.897 2 DEBUG oslo_concurrency.processutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:31:26 compute-0 nova_compute[189265]: 2025-09-30 07:31:26.965 2 DEBUG oslo_concurrency.processutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:31:26 compute-0 nova_compute[189265]: 2025-09-30 07:31:26.967 2 DEBUG oslo_concurrency.lockutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:31:26 compute-0 nova_compute[189265]: 2025-09-30 07:31:26.968 2 DEBUG oslo_concurrency.lockutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:31:26 compute-0 nova_compute[189265]: 2025-09-30 07:31:26.968 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:31:26 compute-0 nova_compute[189265]: 2025-09-30 07:31:26.975 2 DEBUG oslo_utils.imageutils.format_inspector [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:31:26 compute-0 nova_compute[189265]: 2025-09-30 07:31:26.976 2 DEBUG oslo_concurrency.processutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.043 2 DEBUG oslo_concurrency.processutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.044 2 DEBUG oslo_concurrency.processutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.097 2 DEBUG oslo_concurrency.processutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425/disk 1073741824" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.098 2 DEBUG oslo_concurrency.lockutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.130s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.098 2 DEBUG oslo_concurrency.processutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.165 2 DEBUG oslo_concurrency.processutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.167 2 DEBUG nova.virt.disk.api [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Checking if we can resize image /var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.167 2 DEBUG oslo_concurrency.processutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.257 2 DEBUG oslo_concurrency.processutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425/disk --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.259 2 DEBUG nova.virt.disk.api [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Cannot resize image /var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.260 2 DEBUG nova.objects.instance [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'migration_context' on Instance uuid 8b5264fb-4374-456c-aa18-276d431aa425 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.372 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.793 2 DEBUG nova.objects.base [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<8b5264fb-4374-456c-aa18-276d431aa425> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.794 2 DEBUG oslo_concurrency.processutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.834 2 DEBUG oslo_concurrency.processutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425/disk.config 497664" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.835 2 DEBUG nova.virt.libvirt.driver [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.838 2 DEBUG nova.virt.libvirt.vif [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T07:30:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-484805115',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-484805115',id=16,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:30:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-ugqkt2rz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:30:40Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=8b5264fb-4374-456c-aa18-276d431aa425,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f3eac6a4-578b-4544-b899-b34007452c34", "address": "fa:16:3e:2e:65:44", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf3eac6a4-57", "ovs_interfaceid": "f3eac6a4-578b-4544-b899-b34007452c34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.839 2 DEBUG nova.network.os_vif_util [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "f3eac6a4-578b-4544-b899-b34007452c34", "address": "fa:16:3e:2e:65:44", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapf3eac6a4-57", "ovs_interfaceid": "f3eac6a4-578b-4544-b899-b34007452c34", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.840 2 DEBUG nova.network.os_vif_util [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2e:65:44,bridge_name='br-int',has_traffic_filtering=True,id=f3eac6a4-578b-4544-b899-b34007452c34,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3eac6a4-57') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.841 2 DEBUG os_vif [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:65:44,bridge_name='br-int',has_traffic_filtering=True,id=f3eac6a4-578b-4544-b899-b34007452c34,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3eac6a4-57') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.843 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.843 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.845 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.845 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '8048f26f-1524-51ba-86ea-bd568bd23bc0', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.888 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.891 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf3eac6a4-57, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.892 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapf3eac6a4-57, col_values=(('qos', UUID('d5ccc00d-52d1-4ed8-a79f-cf0fd613c6a0')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.893 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapf3eac6a4-57, col_values=(('external_ids', {'iface-id': 'f3eac6a4-578b-4544-b899-b34007452c34', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2e:65:44', 'vm-uuid': '8b5264fb-4374-456c-aa18-276d431aa425'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:27 compute-0 NetworkManager[51813]: <info>  [1759217487.8958] manager: (tapf3eac6a4-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.897 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.904 2 INFO os_vif [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2e:65:44,bridge_name='br-int',has_traffic_filtering=True,id=f3eac6a4-578b-4544-b899-b34007452c34,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3eac6a4-57')
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.904 2 DEBUG nova.virt.libvirt.driver [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.904 2 DEBUG nova.compute.manager [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpk2r093zv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8b5264fb-4374-456c-aa18-276d431aa425',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 07:31:27 compute-0 nova_compute[189265]: 2025-09-30 07:31:27.905 2 WARNING neutronclient.v2_0.client [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:28 compute-0 nova_compute[189265]: 2025-09-30 07:31:28.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:28 compute-0 nova_compute[189265]: 2025-09-30 07:31:28.853 2 WARNING neutronclient.v2_0.client [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:29 compute-0 podman[199733]: time="2025-09-30T07:31:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:31:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:31:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Sep 30 07:31:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:31:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3474 "" "Go-http-client/1.1"
Sep 30 07:31:30 compute-0 ovn_controller[91436]: 2025-09-30T07:31:30Z|00165|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Sep 30 07:31:30 compute-0 podman[219192]: 2025-09-30 07:31:30.490692618 +0000 UTC m=+0.072555157 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:31:30 compute-0 nova_compute[189265]: 2025-09-30 07:31:30.802 2 DEBUG nova.network.neutron [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Port f3eac6a4-578b-4544-b899-b34007452c34 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 07:31:30 compute-0 nova_compute[189265]: 2025-09-30 07:31:30.828 2 DEBUG nova.compute.manager [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpk2r093zv',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='8b5264fb-4374-456c-aa18-276d431aa425',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 07:31:31 compute-0 openstack_network_exporter[201859]: ERROR   07:31:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:31:31 compute-0 openstack_network_exporter[201859]: ERROR   07:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:31:31 compute-0 openstack_network_exporter[201859]: ERROR   07:31:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:31:31 compute-0 openstack_network_exporter[201859]: ERROR   07:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:31:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:31:31 compute-0 openstack_network_exporter[201859]: ERROR   07:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:31:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:31:32 compute-0 nova_compute[189265]: 2025-09-30 07:31:32.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:33 compute-0 systemd[1]: Starting libvirt proxy daemon...
Sep 30 07:31:33 compute-0 systemd[1]: Started libvirt proxy daemon.
Sep 30 07:31:33 compute-0 NetworkManager[51813]: <info>  [1759217493.2643] manager: (tapf3eac6a4-57): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Sep 30 07:31:33 compute-0 kernel: tapf3eac6a4-57: entered promiscuous mode
Sep 30 07:31:33 compute-0 ovn_controller[91436]: 2025-09-30T07:31:33Z|00166|binding|INFO|Claiming lport f3eac6a4-578b-4544-b899-b34007452c34 for this additional chassis.
Sep 30 07:31:33 compute-0 ovn_controller[91436]: 2025-09-30T07:31:33Z|00167|binding|INFO|f3eac6a4-578b-4544-b899-b34007452c34: Claiming fa:16:3e:2e:65:44 10.100.0.7
Sep 30 07:31:33 compute-0 nova_compute[189265]: 2025-09-30 07:31:33.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:33.278 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:65:44 10.100.0.7'], port_security=['fa:16:3e:2e:65:44 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '8b5264fb-4374-456c-aa18-276d431aa425', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '10', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=f3eac6a4-578b-4544-b899-b34007452c34) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:31:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:33.279 100322 INFO neutron.agent.ovn.metadata.agent [-] Port f3eac6a4-578b-4544-b899-b34007452c34 in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:31:33 compute-0 ovn_controller[91436]: 2025-09-30T07:31:33Z|00168|binding|INFO|Setting lport f3eac6a4-578b-4544-b899-b34007452c34 ovn-installed in OVS
Sep 30 07:31:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:33.280 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:31:33 compute-0 nova_compute[189265]: 2025-09-30 07:31:33.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:33 compute-0 nova_compute[189265]: 2025-09-30 07:31:33.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:33.294 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c02066e5-c63f-46e0-87ba-b2eb5677116c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:33 compute-0 systemd-machined[149233]: New machine qemu-13-instance-00000010.
Sep 30 07:31:33 compute-0 systemd-udevd[219249]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:31:33 compute-0 NetworkManager[51813]: <info>  [1759217493.3116] device (tapf3eac6a4-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:31:33 compute-0 NetworkManager[51813]: <info>  [1759217493.3126] device (tapf3eac6a4-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:31:33 compute-0 systemd[1]: Started Virtual Machine qemu-13-instance-00000010.
Sep 30 07:31:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:33.322 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[d1040f30-0ff9-4f6e-a88c-6f45cdbfe2fc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:33.324 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[e603f205-3d36-48de-b439-7ece69d3f650]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:33.349 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[d7acce60-591f-4fde-8c6c-13d224a72656]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:33.363 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[573575a4-dc6a-47a4-9ef0-be8b83caf906]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 531795, 'reachable_time': 33158, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219261, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:33.380 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[8fedac98-1171-4789-b29e-970eb0dd0daf]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 531806, 'tstamp': 531806}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219263, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 531809, 'tstamp': 531809}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219263, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:33.382 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:33 compute-0 nova_compute[189265]: 2025-09-30 07:31:33.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:33 compute-0 nova_compute[189265]: 2025-09-30 07:31:33.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:33.384 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc99c822b-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:33.385 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:31:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:33.385 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc99c822b-30, col_values=(('external_ids', {'iface-id': '67b7df48-3f38-444a-8506-1c0ec5bd1d15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:33.385 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:31:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:33.386 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[9fae712e-0bf5-41dd-9ed0-5c5c2b00b99d]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-c99c822b-3191-49e5-b938-903e25b4a9bb\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID c99c822b-3191-49e5-b938-903e25b4a9bb\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:33 compute-0 nova_compute[189265]: 2025-09-30 07:31:33.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:36.294 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:31:36 compute-0 nova_compute[189265]: 2025-09-30 07:31:36.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:36.295 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:31:36 compute-0 ovn_controller[91436]: 2025-09-30T07:31:36Z|00169|binding|INFO|Claiming lport f3eac6a4-578b-4544-b899-b34007452c34 for this chassis.
Sep 30 07:31:36 compute-0 ovn_controller[91436]: 2025-09-30T07:31:36Z|00170|binding|INFO|f3eac6a4-578b-4544-b899-b34007452c34: Claiming fa:16:3e:2e:65:44 10.100.0.7
Sep 30 07:31:36 compute-0 ovn_controller[91436]: 2025-09-30T07:31:36Z|00171|binding|INFO|Setting lport f3eac6a4-578b-4544-b899-b34007452c34 up in Southbound
Sep 30 07:31:37 compute-0 nova_compute[189265]: 2025-09-30 07:31:37.429 2 INFO nova.compute.manager [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Post operation of migration started
Sep 30 07:31:37 compute-0 nova_compute[189265]: 2025-09-30 07:31:37.430 2 WARNING neutronclient.v2_0.client [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:37 compute-0 nova_compute[189265]: 2025-09-30 07:31:37.529 2 WARNING neutronclient.v2_0.client [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:37 compute-0 nova_compute[189265]: 2025-09-30 07:31:37.530 2 WARNING neutronclient.v2_0.client [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:37 compute-0 nova_compute[189265]: 2025-09-30 07:31:37.608 2 DEBUG oslo_concurrency.lockutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-8b5264fb-4374-456c-aa18-276d431aa425" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:31:37 compute-0 nova_compute[189265]: 2025-09-30 07:31:37.609 2 DEBUG oslo_concurrency.lockutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-8b5264fb-4374-456c-aa18-276d431aa425" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:31:37 compute-0 nova_compute[189265]: 2025-09-30 07:31:37.609 2 DEBUG nova.network.neutron [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:31:37 compute-0 nova_compute[189265]: 2025-09-30 07:31:37.927 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:38 compute-0 nova_compute[189265]: 2025-09-30 07:31:38.120 2 WARNING neutronclient.v2_0.client [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:38 compute-0 nova_compute[189265]: 2025-09-30 07:31:38.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:38 compute-0 nova_compute[189265]: 2025-09-30 07:31:38.914 2 WARNING neutronclient.v2_0.client [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:39 compute-0 nova_compute[189265]: 2025-09-30 07:31:39.119 2 DEBUG nova.network.neutron [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Updating instance_info_cache with network_info: [{"id": "f3eac6a4-578b-4544-b899-b34007452c34", "address": "fa:16:3e:2e:65:44", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3eac6a4-57", "ovs_interfaceid": "f3eac6a4-578b-4544-b899-b34007452c34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:31:39 compute-0 podman[219285]: 2025-09-30 07:31:39.535044778 +0000 UTC m=+0.101137537 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 07:31:39 compute-0 nova_compute[189265]: 2025-09-30 07:31:39.629 2 DEBUG oslo_concurrency.lockutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-8b5264fb-4374-456c-aa18-276d431aa425" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:31:40 compute-0 nova_compute[189265]: 2025-09-30 07:31:40.152 2 DEBUG oslo_concurrency.lockutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:31:40 compute-0 nova_compute[189265]: 2025-09-30 07:31:40.153 2 DEBUG oslo_concurrency.lockutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:31:40 compute-0 nova_compute[189265]: 2025-09-30 07:31:40.154 2 DEBUG oslo_concurrency.lockutils [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:31:40 compute-0 nova_compute[189265]: 2025-09-30 07:31:40.159 2 INFO nova.virt.libvirt.driver [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 07:31:40 compute-0 virtqemud[189090]: Domain id=13 name='instance-00000010' uuid=8b5264fb-4374-456c-aa18-276d431aa425 is tainted: custom-monitor
Sep 30 07:31:41 compute-0 nova_compute[189265]: 2025-09-30 07:31:41.166 2 INFO nova.virt.libvirt.driver [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 07:31:41 compute-0 nova_compute[189265]: 2025-09-30 07:31:41.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:31:42 compute-0 nova_compute[189265]: 2025-09-30 07:31:42.174 2 INFO nova.virt.libvirt.driver [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 07:31:42 compute-0 nova_compute[189265]: 2025-09-30 07:31:42.181 2 DEBUG nova.compute.manager [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:31:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:42.297 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:42 compute-0 podman[219306]: 2025-09-30 07:31:42.519490232 +0000 UTC m=+0.095962410 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_id=edpm, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 07:31:42 compute-0 nova_compute[189265]: 2025-09-30 07:31:42.694 2 DEBUG nova.objects.instance [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 07:31:42 compute-0 nova_compute[189265]: 2025-09-30 07:31:42.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:43 compute-0 nova_compute[189265]: 2025-09-30 07:31:43.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:43 compute-0 nova_compute[189265]: 2025-09-30 07:31:43.714 2 WARNING neutronclient.v2_0.client [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:43 compute-0 nova_compute[189265]: 2025-09-30 07:31:43.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:31:44 compute-0 nova_compute[189265]: 2025-09-30 07:31:44.268 2 WARNING neutronclient.v2_0.client [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:44 compute-0 nova_compute[189265]: 2025-09-30 07:31:44.269 2 WARNING neutronclient.v2_0.client [None req-7c265cb6-7db6-4e09-bad8-6dcb53653dde e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:44 compute-0 nova_compute[189265]: 2025-09-30 07:31:44.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:31:44 compute-0 nova_compute[189265]: 2025-09-30 07:31:44.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:31:44 compute-0 nova_compute[189265]: 2025-09-30 07:31:44.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:31:46 compute-0 podman[219328]: 2025-09-30 07:31:46.511237038 +0000 UTC m=+0.086691788 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 07:31:46 compute-0 podman[219329]: 2025-09-30 07:31:46.539167529 +0000 UTC m=+0.110744289 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 07:31:46 compute-0 podman[219330]: 2025-09-30 07:31:46.612904508 +0000 UTC m=+0.175950166 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20250930)
Sep 30 07:31:47 compute-0 nova_compute[189265]: 2025-09-30 07:31:47.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:48 compute-0 nova_compute[189265]: 2025-09-30 07:31:48.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:48 compute-0 nova_compute[189265]: 2025-09-30 07:31:48.573 2 DEBUG oslo_concurrency.lockutils [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:31:48 compute-0 nova_compute[189265]: 2025-09-30 07:31:48.574 2 DEBUG oslo_concurrency.lockutils [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:31:48 compute-0 nova_compute[189265]: 2025-09-30 07:31:48.574 2 DEBUG oslo_concurrency.lockutils [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:31:48 compute-0 nova_compute[189265]: 2025-09-30 07:31:48.574 2 DEBUG oslo_concurrency.lockutils [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:31:48 compute-0 nova_compute[189265]: 2025-09-30 07:31:48.574 2 DEBUG oslo_concurrency.lockutils [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:31:48 compute-0 nova_compute[189265]: 2025-09-30 07:31:48.588 2 INFO nova.compute.manager [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Terminating instance
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.106 2 DEBUG nova.compute.manager [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:31:49 compute-0 kernel: tap5406a6ea-40 (unregistering): left promiscuous mode
Sep 30 07:31:49 compute-0 NetworkManager[51813]: <info>  [1759217509.1388] device (tap5406a6ea-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:49 compute-0 ovn_controller[91436]: 2025-09-30T07:31:49Z|00172|binding|INFO|Releasing lport 5406a6ea-407c-46b1-b791-a508e191918e from this chassis (sb_readonly=0)
Sep 30 07:31:49 compute-0 ovn_controller[91436]: 2025-09-30T07:31:49Z|00173|binding|INFO|Setting lport 5406a6ea-407c-46b1-b791-a508e191918e down in Southbound
Sep 30 07:31:49 compute-0 ovn_controller[91436]: 2025-09-30T07:31:49Z|00174|binding|INFO|Removing iface tap5406a6ea-40 ovn-installed in OVS
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.172 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:49.174 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0f:9f:77 10.100.0.8'], port_security=['fa:16:3e:0f:9f:77 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': 'a6ffd09b-ce40-4418-87d0-5555a8f04f67', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '5', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=5406a6ea-407c-46b1-b791-a508e191918e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:31:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:49.175 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 5406a6ea-407c-46b1-b791-a508e191918e in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:31:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:49.177 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:31:49 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000011.scope: Deactivated successfully.
Sep 30 07:31:49 compute-0 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000011.scope: Consumed 13.376s CPU time.
Sep 30 07:31:49 compute-0 systemd-machined[149233]: Machine qemu-12-instance-00000011 terminated.
Sep 30 07:31:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:49.209 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[22ac4514-16f6-4ca3-b9d6-c7e05b97e4b4]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:49.250 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[e3ebbbdf-2c98-4f95-9f82-bc34a422a032]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:49.254 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[c7028e0b-fed6-46b8-b3a9-46c8ed1df16a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:49.295 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[1fc2f418-86c6-4965-a981-38f9f5c5eab6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:49.323 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7150ae18-5a73-441d-8b5f-be2052d0bb28]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 531795, 'reachable_time': 33158, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219402, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:49.351 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[65cf13f2-c607-474c-b1b3-441fcd54f452]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 531806, 'tstamp': 531806}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219407, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 531809, 'tstamp': 531809}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219407, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:49.353 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.359 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:49.360 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc99c822b-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:49.361 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:31:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:49.361 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc99c822b-30, col_values=(('external_ids', {'iface-id': '67b7df48-3f38-444a-8506-1c0ec5bd1d15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:49.362 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:31:49 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:49.364 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[17516b1d-3d0e-437f-b9cb-2ff17baa6807]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-c99c822b-3191-49e5-b938-903e25b4a9bb\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID c99c822b-3191-49e5-b938-903e25b4a9bb\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.368 2 DEBUG nova.compute.manager [req-10dc12d6-a08e-4a83-86cd-f2ee7311a380 req-2ceb8b08-70b1-40b4-a00c-0dcdceab89c7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Received event network-vif-unplugged-5406a6ea-407c-46b1-b791-a508e191918e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.369 2 DEBUG oslo_concurrency.lockutils [req-10dc12d6-a08e-4a83-86cd-f2ee7311a380 req-2ceb8b08-70b1-40b4-a00c-0dcdceab89c7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.369 2 DEBUG oslo_concurrency.lockutils [req-10dc12d6-a08e-4a83-86cd-f2ee7311a380 req-2ceb8b08-70b1-40b4-a00c-0dcdceab89c7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.370 2 DEBUG oslo_concurrency.lockutils [req-10dc12d6-a08e-4a83-86cd-f2ee7311a380 req-2ceb8b08-70b1-40b4-a00c-0dcdceab89c7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.370 2 DEBUG nova.compute.manager [req-10dc12d6-a08e-4a83-86cd-f2ee7311a380 req-2ceb8b08-70b1-40b4-a00c-0dcdceab89c7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] No waiting events found dispatching network-vif-unplugged-5406a6ea-407c-46b1-b791-a508e191918e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.371 2 DEBUG nova.compute.manager [req-10dc12d6-a08e-4a83-86cd-f2ee7311a380 req-2ceb8b08-70b1-40b4-a00c-0dcdceab89c7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Received event network-vif-unplugged-5406a6ea-407c-46b1-b791-a508e191918e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.390 2 INFO nova.virt.libvirt.driver [-] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Instance destroyed successfully.
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.391 2 DEBUG nova.objects.instance [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lazy-loading 'resources' on Instance uuid a6ffd09b-ce40-4418-87d0-5555a8f04f67 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.946 2 DEBUG nova.virt.libvirt.vif [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:30:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1986949610',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1986949610',id=17,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:31:01Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-6sh2c7j8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:31:01Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=a6ffd09b-ce40-4418-87d0-5555a8f04f67,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5406a6ea-407c-46b1-b791-a508e191918e", "address": "fa:16:3e:0f:9f:77", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5406a6ea-40", "ovs_interfaceid": "5406a6ea-407c-46b1-b791-a508e191918e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.947 2 DEBUG nova.network.os_vif_util [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converting VIF {"id": "5406a6ea-407c-46b1-b791-a508e191918e", "address": "fa:16:3e:0f:9f:77", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5406a6ea-40", "ovs_interfaceid": "5406a6ea-407c-46b1-b791-a508e191918e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.948 2 DEBUG nova.network.os_vif_util [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0f:9f:77,bridge_name='br-int',has_traffic_filtering=True,id=5406a6ea-407c-46b1-b791-a508e191918e,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5406a6ea-40') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.949 2 DEBUG os_vif [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:9f:77,bridge_name='br-int',has_traffic_filtering=True,id=5406a6ea-407c-46b1-b791-a508e191918e,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5406a6ea-40') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.951 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:49 compute-0 nova_compute[189265]: 2025-09-30 07:31:49.952 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5406a6ea-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.010 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=087ad3a2-0c35-4ece-8dde-3b866f861316) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.015 2 INFO os_vif [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0f:9f:77,bridge_name='br-int',has_traffic_filtering=True,id=5406a6ea-407c-46b1-b791-a508e191918e,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5406a6ea-40')
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.016 2 INFO nova.virt.libvirt.driver [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Deleting instance files /var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67_del
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.018 2 INFO nova.virt.libvirt.driver [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Deletion of /var/lib/nova/instances/a6ffd09b-ce40-4418-87d0-5555a8f04f67_del complete
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.299 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.300 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.300 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.301 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.536 2 INFO nova.compute.manager [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Took 1.43 seconds to destroy the instance on the hypervisor.
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.536 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.537 2 DEBUG nova.compute.manager [-] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.537 2 DEBUG nova.network.neutron [-] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.538 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:50 compute-0 nova_compute[189265]: 2025-09-30 07:31:50.740 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.349 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.435 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.435 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.470 2 DEBUG nova.compute.manager [req-08214446-8036-4a17-be61-3dd1b3810e2c req-3437d622-3a23-40b7-ae36-fec4390f4aa6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Received event network-vif-unplugged-5406a6ea-407c-46b1-b791-a508e191918e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.471 2 DEBUG oslo_concurrency.lockutils [req-08214446-8036-4a17-be61-3dd1b3810e2c req-3437d622-3a23-40b7-ae36-fec4390f4aa6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.471 2 DEBUG oslo_concurrency.lockutils [req-08214446-8036-4a17-be61-3dd1b3810e2c req-3437d622-3a23-40b7-ae36-fec4390f4aa6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.471 2 DEBUG oslo_concurrency.lockutils [req-08214446-8036-4a17-be61-3dd1b3810e2c req-3437d622-3a23-40b7-ae36-fec4390f4aa6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.471 2 DEBUG nova.compute.manager [req-08214446-8036-4a17-be61-3dd1b3810e2c req-3437d622-3a23-40b7-ae36-fec4390f4aa6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] No waiting events found dispatching network-vif-unplugged-5406a6ea-407c-46b1-b791-a508e191918e pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.471 2 DEBUG nova.compute.manager [req-08214446-8036-4a17-be61-3dd1b3810e2c req-3437d622-3a23-40b7-ae36-fec4390f4aa6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Received event network-vif-unplugged-5406a6ea-407c-46b1-b791-a508e191918e for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.472 2 DEBUG nova.compute.manager [req-08214446-8036-4a17-be61-3dd1b3810e2c req-3437d622-3a23-40b7-ae36-fec4390f4aa6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Received event network-vif-deleted-5406a6ea-407c-46b1-b791-a508e191918e external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.472 2 INFO nova.compute.manager [req-08214446-8036-4a17-be61-3dd1b3810e2c req-3437d622-3a23-40b7-ae36-fec4390f4aa6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Neutron deleted interface 5406a6ea-407c-46b1-b791-a508e191918e; detaching it from the instance and deleting it from the info cache
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.472 2 DEBUG nova.network.neutron [req-08214446-8036-4a17-be61-3dd1b3810e2c req-3437d622-3a23-40b7-ae36-fec4390f4aa6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.493 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.495 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Error from libvirt while getting description of instance-00000011: [Error Code 42] Domain not found: no domain with matching uuid 'a6ffd09b-ce40-4418-87d0-5555a8f04f67' (instance-00000011): libvirt.libvirtError: Domain not found: no domain with matching uuid 'a6ffd09b-ce40-4418-87d0-5555a8f04f67' (instance-00000011)
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.520 2 DEBUG nova.network.neutron [-] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.619 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.621 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.656 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.657 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5670MB free_disk=73.2750473022461GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.657 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.657 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:31:51 compute-0 nova_compute[189265]: 2025-09-30 07:31:51.982 2 DEBUG nova.compute.manager [req-08214446-8036-4a17-be61-3dd1b3810e2c req-3437d622-3a23-40b7-ae36-fec4390f4aa6 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Detach interface failed, port_id=5406a6ea-407c-46b1-b791-a508e191918e, reason: Instance a6ffd09b-ce40-4418-87d0-5555a8f04f67 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 07:31:52 compute-0 nova_compute[189265]: 2025-09-30 07:31:52.026 2 INFO nova.compute.manager [-] [instance: a6ffd09b-ce40-4418-87d0-5555a8f04f67] Took 1.49 seconds to deallocate network for instance.
Sep 30 07:31:52 compute-0 nova_compute[189265]: 2025-09-30 07:31:52.548 2 DEBUG oslo_concurrency.lockutils [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:31:53 compute-0 nova_compute[189265]: 2025-09-30 07:31:53.224 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance a6ffd09b-ce40-4418-87d0-5555a8f04f67 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:31:53 compute-0 nova_compute[189265]: 2025-09-30 07:31:53.225 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance 8b5264fb-4374-456c-aa18-276d431aa425 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:31:53 compute-0 nova_compute[189265]: 2025-09-30 07:31:53.226 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:31:53 compute-0 nova_compute[189265]: 2025-09-30 07:31:53.226 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:31:51 up  1:29,  0 user,  load average: 0.24, 0.20, 0.28\n', 'num_instances': '2', 'num_vm_active': '2', 'num_task_None': '1', 'num_os_type_None': '2', 'num_proj_6431607f3dce4c88bbf6d17ee6cd45b2': '2', 'io_workload': '0', 'num_task_deleting': '1'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:31:53 compute-0 nova_compute[189265]: 2025-09-30 07:31:53.282 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:31:53 compute-0 nova_compute[189265]: 2025-09-30 07:31:53.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:53 compute-0 nova_compute[189265]: 2025-09-30 07:31:53.794 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:31:54 compute-0 nova_compute[189265]: 2025-09-30 07:31:54.306 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:31:54 compute-0 nova_compute[189265]: 2025-09-30 07:31:54.306 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.649s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:31:54 compute-0 nova_compute[189265]: 2025-09-30 07:31:54.307 2 DEBUG oslo_concurrency.lockutils [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 1.759s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:31:54 compute-0 nova_compute[189265]: 2025-09-30 07:31:54.367 2 DEBUG nova.compute.provider_tree [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:31:54 compute-0 nova_compute[189265]: 2025-09-30 07:31:54.879 2 DEBUG nova.scheduler.client.report [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:31:55 compute-0 nova_compute[189265]: 2025-09-30 07:31:55.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:55 compute-0 nova_compute[189265]: 2025-09-30 07:31:55.307 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:31:55 compute-0 nova_compute[189265]: 2025-09-30 07:31:55.397 2 DEBUG oslo_concurrency.lockutils [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:31:55 compute-0 nova_compute[189265]: 2025-09-30 07:31:55.421 2 INFO nova.scheduler.client.report [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Deleted allocations for instance a6ffd09b-ce40-4418-87d0-5555a8f04f67
Sep 30 07:31:55 compute-0 nova_compute[189265]: 2025-09-30 07:31:55.822 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:31:55 compute-0 nova_compute[189265]: 2025-09-30 07:31:55.822 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:31:56 compute-0 nova_compute[189265]: 2025-09-30 07:31:56.457 2 DEBUG oslo_concurrency.lockutils [None req-cd5ebe51-376c-4db8-bb14-df55dcbc8a3f 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "a6ffd09b-ce40-4418-87d0-5555a8f04f67" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.884s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:31:57 compute-0 nova_compute[189265]: 2025-09-30 07:31:57.311 2 DEBUG oslo_concurrency.lockutils [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "8b5264fb-4374-456c-aa18-276d431aa425" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:31:57 compute-0 nova_compute[189265]: 2025-09-30 07:31:57.312 2 DEBUG oslo_concurrency.lockutils [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "8b5264fb-4374-456c-aa18-276d431aa425" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:31:57 compute-0 nova_compute[189265]: 2025-09-30 07:31:57.313 2 DEBUG oslo_concurrency.lockutils [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "8b5264fb-4374-456c-aa18-276d431aa425-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:31:57 compute-0 nova_compute[189265]: 2025-09-30 07:31:57.313 2 DEBUG oslo_concurrency.lockutils [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "8b5264fb-4374-456c-aa18-276d431aa425-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:31:57 compute-0 nova_compute[189265]: 2025-09-30 07:31:57.314 2 DEBUG oslo_concurrency.lockutils [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "8b5264fb-4374-456c-aa18-276d431aa425-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:31:57 compute-0 nova_compute[189265]: 2025-09-30 07:31:57.330 2 INFO nova.compute.manager [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Terminating instance
Sep 30 07:31:57 compute-0 nova_compute[189265]: 2025-09-30 07:31:57.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:31:57 compute-0 nova_compute[189265]: 2025-09-30 07:31:57.846 2 DEBUG nova.compute.manager [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:31:57 compute-0 kernel: tapf3eac6a4-57 (unregistering): left promiscuous mode
Sep 30 07:31:57 compute-0 NetworkManager[51813]: <info>  [1759217517.8816] device (tapf3eac6a4-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:31:57 compute-0 ovn_controller[91436]: 2025-09-30T07:31:57Z|00175|binding|INFO|Releasing lport f3eac6a4-578b-4544-b899-b34007452c34 from this chassis (sb_readonly=0)
Sep 30 07:31:57 compute-0 ovn_controller[91436]: 2025-09-30T07:31:57Z|00176|binding|INFO|Setting lport f3eac6a4-578b-4544-b899-b34007452c34 down in Southbound
Sep 30 07:31:57 compute-0 ovn_controller[91436]: 2025-09-30T07:31:57Z|00177|binding|INFO|Removing iface tapf3eac6a4-57 ovn-installed in OVS
Sep 30 07:31:57 compute-0 nova_compute[189265]: 2025-09-30 07:31:57.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:57 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:57.947 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2e:65:44 10.100.0.7'], port_security=['fa:16:3e:2e:65:44 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '8b5264fb-4374-456c-aa18-276d431aa425', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '15', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=f3eac6a4-578b-4544-b899-b34007452c34) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:31:57 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:57.948 100322 INFO neutron.agent.ovn.metadata.agent [-] Port f3eac6a4-578b-4544-b899-b34007452c34 in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:31:57 compute-0 nova_compute[189265]: 2025-09-30 07:31:57.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:57 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:57.950 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c99c822b-3191-49e5-b938-903e25b4a9bb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:31:57 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:57.951 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[893136df-213d-4a88-bced-52a1f4943fea]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:57 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:57.951 100322 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb namespace which is not needed anymore
Sep 30 07:31:57 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000010.scope: Deactivated successfully.
Sep 30 07:31:57 compute-0 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000010.scope: Consumed 2.962s CPU time.
Sep 30 07:31:58 compute-0 systemd-machined[149233]: Machine qemu-13-instance-00000010 terminated.
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:58 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[219026]: [NOTICE]   (219031) : haproxy version is 3.0.5-8e879a5
Sep 30 07:31:58 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[219026]: [NOTICE]   (219031) : path to executable is /usr/sbin/haproxy
Sep 30 07:31:58 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[219026]: [WARNING]  (219031) : Exiting Master process...
Sep 30 07:31:58 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[219026]: [ALERT]    (219031) : Current worker (219033) exited with code 143 (Terminated)
Sep 30 07:31:58 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[219026]: [WARNING]  (219031) : All workers exited. Exiting... (0)
Sep 30 07:31:58 compute-0 podman[219456]: 2025-09-30 07:31:58.122683676 +0000 UTC m=+0.040804317 container kill 7ba814fe689129cd63077f59b73aba0bf7f44e7c874b8e5102721e0e554267f8 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 07:31:58 compute-0 systemd[1]: libpod-7ba814fe689129cd63077f59b73aba0bf7f44e7c874b8e5102721e0e554267f8.scope: Deactivated successfully.
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.129 2 INFO nova.virt.libvirt.driver [-] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Instance destroyed successfully.
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.130 2 DEBUG nova.objects.instance [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lazy-loading 'resources' on Instance uuid 8b5264fb-4374-456c-aa18-276d431aa425 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.136 2 DEBUG nova.compute.manager [req-ee21d61e-2c93-43e1-a880-b71efb4265f3 req-ef349e51-3342-4118-a94d-79ba48fe022f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Received event network-vif-unplugged-f3eac6a4-578b-4544-b899-b34007452c34 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.136 2 DEBUG oslo_concurrency.lockutils [req-ee21d61e-2c93-43e1-a880-b71efb4265f3 req-ef349e51-3342-4118-a94d-79ba48fe022f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "8b5264fb-4374-456c-aa18-276d431aa425-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.136 2 DEBUG oslo_concurrency.lockutils [req-ee21d61e-2c93-43e1-a880-b71efb4265f3 req-ef349e51-3342-4118-a94d-79ba48fe022f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "8b5264fb-4374-456c-aa18-276d431aa425-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.137 2 DEBUG oslo_concurrency.lockutils [req-ee21d61e-2c93-43e1-a880-b71efb4265f3 req-ef349e51-3342-4118-a94d-79ba48fe022f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "8b5264fb-4374-456c-aa18-276d431aa425-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.137 2 DEBUG nova.compute.manager [req-ee21d61e-2c93-43e1-a880-b71efb4265f3 req-ef349e51-3342-4118-a94d-79ba48fe022f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] No waiting events found dispatching network-vif-unplugged-f3eac6a4-578b-4544-b899-b34007452c34 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.137 2 DEBUG nova.compute.manager [req-ee21d61e-2c93-43e1-a880-b71efb4265f3 req-ef349e51-3342-4118-a94d-79ba48fe022f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Received event network-vif-unplugged-f3eac6a4-578b-4544-b899-b34007452c34 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:31:58 compute-0 podman[219481]: 2025-09-30 07:31:58.181511923 +0000 UTC m=+0.036783673 container died 7ba814fe689129cd63077f59b73aba0bf7f44e7c874b8e5102721e0e554267f8 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 07:31:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7ba814fe689129cd63077f59b73aba0bf7f44e7c874b8e5102721e0e554267f8-userdata-shm.mount: Deactivated successfully.
Sep 30 07:31:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-5779053d06151f4b9c6748a2ecda1c9ff8c0d351cb69fb72302c73565796c6ef-merged.mount: Deactivated successfully.
Sep 30 07:31:58 compute-0 podman[219481]: 2025-09-30 07:31:58.225055287 +0000 UTC m=+0.080326987 container cleanup 7ba814fe689129cd63077f59b73aba0bf7f44e7c874b8e5102721e0e554267f8 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:31:58 compute-0 systemd[1]: libpod-conmon-7ba814fe689129cd63077f59b73aba0bf7f44e7c874b8e5102721e0e554267f8.scope: Deactivated successfully.
Sep 30 07:31:58 compute-0 podman[219483]: 2025-09-30 07:31:58.241181014 +0000 UTC m=+0.085069971 container remove 7ba814fe689129cd63077f59b73aba0bf7f44e7c874b8e5102721e0e554267f8 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930)
Sep 30 07:31:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:58.263 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[9493508c-60cf-403d-a12d-61d412e3c2d8]: (4, ("Tue Sep 30 07:31:58 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb (7ba814fe689129cd63077f59b73aba0bf7f44e7c874b8e5102721e0e554267f8)\n7ba814fe689129cd63077f59b73aba0bf7f44e7c874b8e5102721e0e554267f8\nTue Sep 30 07:31:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb (7ba814fe689129cd63077f59b73aba0bf7f44e7c874b8e5102721e0e554267f8)\n7ba814fe689129cd63077f59b73aba0bf7f44e7c874b8e5102721e0e554267f8\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:58.265 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[4af75b90-23a2-421b-b534-12cd618fb373]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:58.265 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:31:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:58.266 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[feee413b-96eb-4f5b-993e-3793aea0ed7f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:58.266 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:58 compute-0 kernel: tapc99c822b-30: left promiscuous mode
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:58.285 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[617d660e-63d1-452e-af06-7f1be79f5214]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:58.310 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[e32c7998-5f4f-477b-a568-40b0efc5a549]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:58.311 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c1be4829-852c-4314-87be-0697f41e070d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:58.323 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[359f4a36-b020-4d7d-a83f-03f110b66a1d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 531787, 'reachable_time': 26532, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219514, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:58.325 100440 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 07:31:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:31:58.325 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[372cae38-54e3-4424-aab7-558bb8f7bcc5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:31:58 compute-0 systemd[1]: run-netns-ovnmeta\x2dc99c822b\x2d3191\x2d49e5\x2db938\x2d903e25b4a9bb.mount: Deactivated successfully.
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.640 2 DEBUG nova.virt.libvirt.vif [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T07:30:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-484805115',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-484805115',id=16,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:30:40Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-ugqkt2rz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',clean_attempts='1',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:31:43Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=8b5264fb-4374-456c-aa18-276d431aa425,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f3eac6a4-578b-4544-b899-b34007452c34", "address": "fa:16:3e:2e:65:44", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3eac6a4-57", "ovs_interfaceid": "f3eac6a4-578b-4544-b899-b34007452c34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.640 2 DEBUG nova.network.os_vif_util [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converting VIF {"id": "f3eac6a4-578b-4544-b899-b34007452c34", "address": "fa:16:3e:2e:65:44", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3eac6a4-57", "ovs_interfaceid": "f3eac6a4-578b-4544-b899-b34007452c34", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.642 2 DEBUG nova.network.os_vif_util [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2e:65:44,bridge_name='br-int',has_traffic_filtering=True,id=f3eac6a4-578b-4544-b899-b34007452c34,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3eac6a4-57') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.642 2 DEBUG os_vif [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:65:44,bridge_name='br-int',has_traffic_filtering=True,id=f3eac6a4-578b-4544-b899-b34007452c34,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3eac6a4-57') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.645 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3eac6a4-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.652 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=d5ccc00d-52d1-4ed8-a79f-cf0fd613c6a0) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.659 2 INFO os_vif [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2e:65:44,bridge_name='br-int',has_traffic_filtering=True,id=f3eac6a4-578b-4544-b899-b34007452c34,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3eac6a4-57')
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.660 2 INFO nova.virt.libvirt.driver [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Deleting instance files /var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425_del
Sep 30 07:31:58 compute-0 nova_compute[189265]: 2025-09-30 07:31:58.660 2 INFO nova.virt.libvirt.driver [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Deletion of /var/lib/nova/instances/8b5264fb-4374-456c-aa18-276d431aa425_del complete
Sep 30 07:31:59 compute-0 nova_compute[189265]: 2025-09-30 07:31:59.177 2 INFO nova.compute.manager [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Took 1.33 seconds to destroy the instance on the hypervisor.
Sep 30 07:31:59 compute-0 nova_compute[189265]: 2025-09-30 07:31:59.178 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:31:59 compute-0 nova_compute[189265]: 2025-09-30 07:31:59.179 2 DEBUG nova.compute.manager [-] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:31:59 compute-0 nova_compute[189265]: 2025-09-30 07:31:59.179 2 DEBUG nova.network.neutron [-] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:31:59 compute-0 nova_compute[189265]: 2025-09-30 07:31:59.179 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:59 compute-0 podman[199733]: time="2025-09-30T07:31:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:31:59 compute-0 nova_compute[189265]: 2025-09-30 07:31:59.746 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:31:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:31:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:31:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:31:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3009 "" "Go-http-client/1.1"
Sep 30 07:32:00 compute-0 nova_compute[189265]: 2025-09-30 07:32:00.089 2 DEBUG nova.compute.manager [req-144cf297-02e2-4405-b5d0-0e997017917c req-4c41feea-25f2-4029-89a9-6e312fadcddc 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Received event network-vif-deleted-f3eac6a4-578b-4544-b899-b34007452c34 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:32:00 compute-0 nova_compute[189265]: 2025-09-30 07:32:00.089 2 INFO nova.compute.manager [req-144cf297-02e2-4405-b5d0-0e997017917c req-4c41feea-25f2-4029-89a9-6e312fadcddc 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Neutron deleted interface f3eac6a4-578b-4544-b899-b34007452c34; detaching it from the instance and deleting it from the info cache
Sep 30 07:32:00 compute-0 nova_compute[189265]: 2025-09-30 07:32:00.089 2 DEBUG nova.network.neutron [req-144cf297-02e2-4405-b5d0-0e997017917c req-4c41feea-25f2-4029-89a9-6e312fadcddc 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:32:00 compute-0 nova_compute[189265]: 2025-09-30 07:32:00.191 2 DEBUG nova.compute.manager [req-313227ad-80a4-471a-8a58-48ffb7301746 req-8cf470d6-7967-4cdc-9a27-0f344d67772b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Received event network-vif-unplugged-f3eac6a4-578b-4544-b899-b34007452c34 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:32:00 compute-0 nova_compute[189265]: 2025-09-30 07:32:00.192 2 DEBUG oslo_concurrency.lockutils [req-313227ad-80a4-471a-8a58-48ffb7301746 req-8cf470d6-7967-4cdc-9a27-0f344d67772b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "8b5264fb-4374-456c-aa18-276d431aa425-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:32:00 compute-0 nova_compute[189265]: 2025-09-30 07:32:00.192 2 DEBUG oslo_concurrency.lockutils [req-313227ad-80a4-471a-8a58-48ffb7301746 req-8cf470d6-7967-4cdc-9a27-0f344d67772b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "8b5264fb-4374-456c-aa18-276d431aa425-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:32:00 compute-0 nova_compute[189265]: 2025-09-30 07:32:00.192 2 DEBUG oslo_concurrency.lockutils [req-313227ad-80a4-471a-8a58-48ffb7301746 req-8cf470d6-7967-4cdc-9a27-0f344d67772b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "8b5264fb-4374-456c-aa18-276d431aa425-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:32:00 compute-0 nova_compute[189265]: 2025-09-30 07:32:00.193 2 DEBUG nova.compute.manager [req-313227ad-80a4-471a-8a58-48ffb7301746 req-8cf470d6-7967-4cdc-9a27-0f344d67772b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] No waiting events found dispatching network-vif-unplugged-f3eac6a4-578b-4544-b899-b34007452c34 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:32:00 compute-0 nova_compute[189265]: 2025-09-30 07:32:00.193 2 DEBUG nova.compute.manager [req-313227ad-80a4-471a-8a58-48ffb7301746 req-8cf470d6-7967-4cdc-9a27-0f344d67772b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Received event network-vif-unplugged-f3eac6a4-578b-4544-b899-b34007452c34 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:32:00 compute-0 nova_compute[189265]: 2025-09-30 07:32:00.534 2 DEBUG nova.network.neutron [-] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:32:00 compute-0 nova_compute[189265]: 2025-09-30 07:32:00.597 2 DEBUG nova.compute.manager [req-144cf297-02e2-4405-b5d0-0e997017917c req-4c41feea-25f2-4029-89a9-6e312fadcddc 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Detach interface failed, port_id=f3eac6a4-578b-4544-b899-b34007452c34, reason: Instance 8b5264fb-4374-456c-aa18-276d431aa425 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 07:32:01 compute-0 nova_compute[189265]: 2025-09-30 07:32:01.042 2 INFO nova.compute.manager [-] [instance: 8b5264fb-4374-456c-aa18-276d431aa425] Took 1.86 seconds to deallocate network for instance.
Sep 30 07:32:01 compute-0 openstack_network_exporter[201859]: ERROR   07:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:32:01 compute-0 openstack_network_exporter[201859]: ERROR   07:32:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:32:01 compute-0 openstack_network_exporter[201859]: ERROR   07:32:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:32:01 compute-0 openstack_network_exporter[201859]: ERROR   07:32:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:32:01 compute-0 openstack_network_exporter[201859]: ERROR   07:32:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:32:01 compute-0 podman[219515]: 2025-09-30 07:32:01.472438851 +0000 UTC m=+0.055756061 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:32:01 compute-0 nova_compute[189265]: 2025-09-30 07:32:01.578 2 DEBUG oslo_concurrency.lockutils [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:32:01 compute-0 nova_compute[189265]: 2025-09-30 07:32:01.579 2 DEBUG oslo_concurrency.lockutils [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:32:01 compute-0 nova_compute[189265]: 2025-09-30 07:32:01.628 2 DEBUG nova.compute.provider_tree [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:32:02 compute-0 nova_compute[189265]: 2025-09-30 07:32:02.135 2 DEBUG nova.scheduler.client.report [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:32:02 compute-0 nova_compute[189265]: 2025-09-30 07:32:02.648 2 DEBUG oslo_concurrency.lockutils [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.069s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:32:02 compute-0 nova_compute[189265]: 2025-09-30 07:32:02.688 2 INFO nova.scheduler.client.report [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Deleted allocations for instance 8b5264fb-4374-456c-aa18-276d431aa425
Sep 30 07:32:03 compute-0 nova_compute[189265]: 2025-09-30 07:32:03.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:03 compute-0 nova_compute[189265]: 2025-09-30 07:32:03.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:03 compute-0 nova_compute[189265]: 2025-09-30 07:32:03.720 2 DEBUG oslo_concurrency.lockutils [None req-c2010f80-d654-4a0a-a12b-83b4174f8103 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "8b5264fb-4374-456c-aa18-276d431aa425" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.408s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:32:08 compute-0 nova_compute[189265]: 2025-09-30 07:32:08.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:08 compute-0 nova_compute[189265]: 2025-09-30 07:32:08.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:10 compute-0 podman[219540]: 2025-09-30 07:32:10.483477147 +0000 UTC m=+0.070173809 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 07:32:13 compute-0 podman[219560]: 2025-09-30 07:32:13.479604501 +0000 UTC m=+0.059554978 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 07:32:13 compute-0 nova_compute[189265]: 2025-09-30 07:32:13.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:13 compute-0 nova_compute[189265]: 2025-09-30 07:32:13.658 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:17 compute-0 podman[219582]: 2025-09-30 07:32:17.512712669 +0000 UTC m=+0.082842858 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, 
managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:32:17 compute-0 podman[219581]: 2025-09-30 07:32:17.524804892 +0000 UTC m=+0.099962703 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Sep 30 07:32:17 compute-0 podman[219583]: 2025-09-30 07:32:17.562642124 +0000 UTC m=+0.129419918 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:32:18 compute-0 nova_compute[189265]: 2025-09-30 07:32:18.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:18 compute-0 nova_compute[189265]: 2025-09-30 07:32:18.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:20.564 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:32:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:20.564 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:32:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:20.564 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:32:23 compute-0 nova_compute[189265]: 2025-09-30 07:32:23.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:23 compute-0 nova_compute[189265]: 2025-09-30 07:32:23.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:28 compute-0 nova_compute[189265]: 2025-09-30 07:32:28.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:28 compute-0 nova_compute[189265]: 2025-09-30 07:32:28.664 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:29 compute-0 podman[199733]: time="2025-09-30T07:32:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:32:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:32:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:32:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:32:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3011 "" "Go-http-client/1.1"
Sep 30 07:32:31 compute-0 openstack_network_exporter[201859]: ERROR   07:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:32:31 compute-0 openstack_network_exporter[201859]: ERROR   07:32:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:32:31 compute-0 openstack_network_exporter[201859]: ERROR   07:32:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:32:31 compute-0 openstack_network_exporter[201859]: ERROR   07:32:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:32:31 compute-0 openstack_network_exporter[201859]: ERROR   07:32:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:32:32 compute-0 podman[219644]: 2025-09-30 07:32:32.477294328 +0000 UTC m=+0.061132881 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:32:32 compute-0 nova_compute[189265]: 2025-09-30 07:32:32.509 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "3dbea315-3898-49bb-843e-b31c235e99e0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:32:32 compute-0 nova_compute[189265]: 2025-09-30 07:32:32.509 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "3dbea315-3898-49bb-843e-b31c235e99e0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:32:33 compute-0 nova_compute[189265]: 2025-09-30 07:32:33.014 2 DEBUG nova.compute.manager [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 07:32:33 compute-0 nova_compute[189265]: 2025-09-30 07:32:33.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:33 compute-0 nova_compute[189265]: 2025-09-30 07:32:33.599 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:32:33 compute-0 nova_compute[189265]: 2025-09-30 07:32:33.600 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:32:33 compute-0 nova_compute[189265]: 2025-09-30 07:32:33.610 2 DEBUG nova.virt.hardware [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 07:32:33 compute-0 nova_compute[189265]: 2025-09-30 07:32:33.611 2 INFO nova.compute.claims [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Claim successful on node compute-0.ctlplane.example.com
Sep 30 07:32:33 compute-0 nova_compute[189265]: 2025-09-30 07:32:33.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:34 compute-0 nova_compute[189265]: 2025-09-30 07:32:34.673 2 DEBUG nova.compute.provider_tree [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:32:35 compute-0 nova_compute[189265]: 2025-09-30 07:32:35.181 2 DEBUG nova.scheduler.client.report [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:32:35 compute-0 nova_compute[189265]: 2025-09-30 07:32:35.694 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:32:35 compute-0 nova_compute[189265]: 2025-09-30 07:32:35.695 2 DEBUG nova.compute.manager [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 07:32:36 compute-0 nova_compute[189265]: 2025-09-30 07:32:36.204 2 DEBUG nova.compute.manager [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 07:32:36 compute-0 nova_compute[189265]: 2025-09-30 07:32:36.205 2 DEBUG nova.network.neutron [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 07:32:36 compute-0 nova_compute[189265]: 2025-09-30 07:32:36.205 2 WARNING neutronclient.v2_0.client [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:32:36 compute-0 nova_compute[189265]: 2025-09-30 07:32:36.205 2 WARNING neutronclient.v2_0.client [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:32:36 compute-0 nova_compute[189265]: 2025-09-30 07:32:36.713 2 INFO nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 07:32:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:37.199 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:32:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:37.200 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:32:37 compute-0 nova_compute[189265]: 2025-09-30 07:32:37.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:37 compute-0 nova_compute[189265]: 2025-09-30 07:32:37.223 2 DEBUG nova.compute.manager [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 07:32:37 compute-0 nova_compute[189265]: 2025-09-30 07:32:37.601 2 DEBUG nova.network.neutron [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Successfully created port: 3ba5e068-43ce-405c-886f-070951e83cf3 _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 07:32:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:38.202 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.232 2 DEBUG nova.network.neutron [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Successfully updated port: 3ba5e068-43ce-405c-886f-070951e83cf3 _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.246 2 DEBUG nova.compute.manager [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.248 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.249 2 INFO nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Creating image(s)
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.249 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "/var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.250 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "/var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.251 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "/var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.252 2 DEBUG oslo_utils.imageutils.format_inspector [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.258 2 DEBUG oslo_utils.imageutils.format_inspector [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.261 2 DEBUG oslo_concurrency.processutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.304 2 DEBUG nova.compute.manager [req-37931471-b4de-4b9e-852c-9e2f149b0ae8 req-4ae8aeb5-d0ee-4eb5-b893-2b299d48c7bf 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Received event network-changed-3ba5e068-43ce-405c-886f-070951e83cf3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.305 2 DEBUG nova.compute.manager [req-37931471-b4de-4b9e-852c-9e2f149b0ae8 req-4ae8aeb5-d0ee-4eb5-b893-2b299d48c7bf 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Refreshing instance network info cache due to event network-changed-3ba5e068-43ce-405c-886f-070951e83cf3. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.305 2 DEBUG oslo_concurrency.lockutils [req-37931471-b4de-4b9e-852c-9e2f149b0ae8 req-4ae8aeb5-d0ee-4eb5-b893-2b299d48c7bf 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-3dbea315-3898-49bb-843e-b31c235e99e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.306 2 DEBUG oslo_concurrency.lockutils [req-37931471-b4de-4b9e-852c-9e2f149b0ae8 req-4ae8aeb5-d0ee-4eb5-b893-2b299d48c7bf 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-3dbea315-3898-49bb-843e-b31c235e99e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.306 2 DEBUG nova.network.neutron [req-37931471-b4de-4b9e-852c-9e2f149b0ae8 req-4ae8aeb5-d0ee-4eb5-b893-2b299d48c7bf 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Refreshing network info cache for port 3ba5e068-43ce-405c-886f-070951e83cf3 _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.342 2 DEBUG oslo_concurrency.processutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.343 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.344 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.345 2 DEBUG oslo_utils.imageutils.format_inspector [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.351 2 DEBUG oslo_utils.imageutils.format_inspector [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.352 2 DEBUG oslo_concurrency.processutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.450 2 DEBUG oslo_concurrency.processutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.451 2 DEBUG oslo_concurrency.processutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.487 2 DEBUG oslo_concurrency.processutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.488 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.145s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.489 2 DEBUG oslo_concurrency.processutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.548 2 DEBUG oslo_concurrency.processutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.549 2 DEBUG nova.virt.disk.api [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Checking if we can resize image /var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.550 2 DEBUG oslo_concurrency.processutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.601 2 DEBUG oslo_concurrency.processutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.602 2 DEBUG nova.virt.disk.api [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Cannot resize image /var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.602 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.603 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Ensure instance console log exists: /var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.603 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.603 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.604 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.739 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "refresh_cache-3dbea315-3898-49bb-843e-b31c235e99e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:32:38 compute-0 nova_compute[189265]: 2025-09-30 07:32:38.814 2 WARNING neutronclient.v2_0.client [req-37931471-b4de-4b9e-852c-9e2f149b0ae8 req-4ae8aeb5-d0ee-4eb5-b893-2b299d48c7bf 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:32:39 compute-0 nova_compute[189265]: 2025-09-30 07:32:39.361 2 DEBUG nova.network.neutron [req-37931471-b4de-4b9e-852c-9e2f149b0ae8 req-4ae8aeb5-d0ee-4eb5-b893-2b299d48c7bf 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:32:39 compute-0 nova_compute[189265]: 2025-09-30 07:32:39.562 2 DEBUG nova.network.neutron [req-37931471-b4de-4b9e-852c-9e2f149b0ae8 req-4ae8aeb5-d0ee-4eb5-b893-2b299d48c7bf 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:32:40 compute-0 nova_compute[189265]: 2025-09-30 07:32:40.068 2 DEBUG oslo_concurrency.lockutils [req-37931471-b4de-4b9e-852c-9e2f149b0ae8 req-4ae8aeb5-d0ee-4eb5-b893-2b299d48c7bf 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-3dbea315-3898-49bb-843e-b31c235e99e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:32:40 compute-0 nova_compute[189265]: 2025-09-30 07:32:40.070 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquired lock "refresh_cache-3dbea315-3898-49bb-843e-b31c235e99e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:32:40 compute-0 nova_compute[189265]: 2025-09-30 07:32:40.070 2 DEBUG nova.network.neutron [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:32:40 compute-0 nova_compute[189265]: 2025-09-30 07:32:40.748 2 DEBUG nova.network.neutron [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.026 2 WARNING neutronclient.v2_0.client [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.174 2 DEBUG nova.network.neutron [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Updating instance_info_cache with network_info: [{"id": "3ba5e068-43ce-405c-886f-070951e83cf3", "address": "fa:16:3e:11:0c:b9", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba5e068-43", "ovs_interfaceid": "3ba5e068-43ce-405c-886f-070951e83cf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:32:41 compute-0 podman[219686]: 2025-09-30 07:32:41.513048197 +0000 UTC m=+0.088587332 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.681 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Releasing lock "refresh_cache-3dbea315-3898-49bb-843e-b31c235e99e0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.682 2 DEBUG nova.compute.manager [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Instance network_info: |[{"id": "3ba5e068-43ce-405c-886f-070951e83cf3", "address": "fa:16:3e:11:0c:b9", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba5e068-43", "ovs_interfaceid": "3ba5e068-43ce-405c-886f-070951e83cf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.686 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Start _get_guest_xml network_info=[{"id": "3ba5e068-43ce-405c-886f-070951e83cf3", "address": "fa:16:3e:11:0c:b9", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba5e068-43", "ovs_interfaceid": "3ba5e068-43ce-405c-886f-070951e83cf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '0c6b92f5-9861-49e4-862d-3ffd84520dfa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.692 2 WARNING nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.694 2 DEBUG nova.virt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-1531325207', uuid='3dbea315-3898-49bb-843e-b31c235e99e0'), owner=OwnerMeta(userid='89ba5d19014145188ad2a3c812acdc88', username='tempest-TestExecuteStrategies-1096120513-project-admin', projectid='6431607f3dce4c88bbf6d17ee6cd45b2', projectname='tempest-TestExecuteStrategies-1096120513'), image=ImageMeta(id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "3ba5e068-43ce-405c-886f-070951e83cf3", "address": "fa:16:3e:11:0c:b9", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba5e068-43", "ovs_interfaceid": "3ba5e068-43ce-405c-886f-070951e83cf3", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759217561.6938827) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.700 2 DEBUG nova.virt.libvirt.host [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.701 2 DEBUG nova.virt.libvirt.host [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.707 2 DEBUG nova.virt.libvirt.host [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.708 2 DEBUG nova.virt.libvirt.host [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.708 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.709 2 DEBUG nova.virt.hardware [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T07:07:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.709 2 DEBUG nova.virt.hardware [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.710 2 DEBUG nova.virt.hardware [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.710 2 DEBUG nova.virt.hardware [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.711 2 DEBUG nova.virt.hardware [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.711 2 DEBUG nova.virt.hardware [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.712 2 DEBUG nova.virt.hardware [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.712 2 DEBUG nova.virt.hardware [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.713 2 DEBUG nova.virt.hardware [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.713 2 DEBUG nova.virt.hardware [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.713 2 DEBUG nova.virt.hardware [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.720 2 DEBUG nova.virt.libvirt.vif [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:32:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1531325207',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1531325207',id=19,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-kqrx0v8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admi
n'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:32:37Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=3dbea315-3898-49bb-843e-b31c235e99e0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3ba5e068-43ce-405c-886f-070951e83cf3", "address": "fa:16:3e:11:0c:b9", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba5e068-43", "ovs_interfaceid": "3ba5e068-43ce-405c-886f-070951e83cf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.721 2 DEBUG nova.network.os_vif_util [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converting VIF {"id": "3ba5e068-43ce-405c-886f-070951e83cf3", "address": "fa:16:3e:11:0c:b9", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba5e068-43", "ovs_interfaceid": "3ba5e068-43ce-405c-886f-070951e83cf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.722 2 DEBUG nova.network.os_vif_util [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:0c:b9,bridge_name='br-int',has_traffic_filtering=True,id=3ba5e068-43ce-405c-886f-070951e83cf3,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ba5e068-43') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.723 2 DEBUG nova.objects.instance [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3dbea315-3898-49bb-843e-b31c235e99e0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:32:41 compute-0 nova_compute[189265]: 2025-09-30 07:32:41.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.236 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] End _get_guest_xml xml=<domain type="kvm">
Sep 30 07:32:42 compute-0 nova_compute[189265]:   <uuid>3dbea315-3898-49bb-843e-b31c235e99e0</uuid>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   <name>instance-00000013</name>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   <memory>131072</memory>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   <vcpu>1</vcpu>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteStrategies-server-1531325207</nova:name>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:32:41</nova:creationTime>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:32:42 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:32:42 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:32:42 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:32:42 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:32:42 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:32:42 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:32:42 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:32:42 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:32:42 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:32:42 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:32:42 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:32:42 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:32:42 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:32:42 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:32:42 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:32:42 compute-0 nova_compute[189265]:         <nova:user uuid="89ba5d19014145188ad2a3c812acdc88">tempest-TestExecuteStrategies-1096120513-project-admin</nova:user>
Sep 30 07:32:42 compute-0 nova_compute[189265]:         <nova:project uuid="6431607f3dce4c88bbf6d17ee6cd45b2">tempest-TestExecuteStrategies-1096120513</nova:project>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:32:42 compute-0 nova_compute[189265]:         <nova:port uuid="3ba5e068-43ce-405c-886f-070951e83cf3">
Sep 30 07:32:42 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <system>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <entry name="serial">3dbea315-3898-49bb-843e-b31c235e99e0</entry>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <entry name="uuid">3dbea315-3898-49bb-843e-b31c235e99e0</entry>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     </system>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   <os>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   </os>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   <features>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <vmcoreinfo/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   </features>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   <cpu mode="host-model" match="exact">
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk.config"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:11:0c:b9"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <target dev="tap3ba5e068-43"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/console.log" append="off"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <video>
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     </video>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <controller type="usb" index="0"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:32:42 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:32:42 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:32:42 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:32:42 compute-0 nova_compute[189265]: </domain>
Sep 30 07:32:42 compute-0 nova_compute[189265]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.238 2 DEBUG nova.compute.manager [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Preparing to wait for external event network-vif-plugged-3ba5e068-43ce-405c-886f-070951e83cf3 prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.238 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.238 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.239 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.240 2 DEBUG nova.virt.libvirt.vif [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:32:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1531325207',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1531325207',id=19,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-kqrx0v8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-pr
oject-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:32:37Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=3dbea315-3898-49bb-843e-b31c235e99e0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3ba5e068-43ce-405c-886f-070951e83cf3", "address": "fa:16:3e:11:0c:b9", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba5e068-43", "ovs_interfaceid": "3ba5e068-43ce-405c-886f-070951e83cf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.240 2 DEBUG nova.network.os_vif_util [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converting VIF {"id": "3ba5e068-43ce-405c-886f-070951e83cf3", "address": "fa:16:3e:11:0c:b9", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba5e068-43", "ovs_interfaceid": "3ba5e068-43ce-405c-886f-070951e83cf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.242 2 DEBUG nova.network.os_vif_util [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:0c:b9,bridge_name='br-int',has_traffic_filtering=True,id=3ba5e068-43ce-405c-886f-070951e83cf3,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ba5e068-43') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.242 2 DEBUG os_vif [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:0c:b9,bridge_name='br-int',has_traffic_filtering=True,id=3ba5e068-43ce-405c-886f-070951e83cf3,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ba5e068-43') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.244 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.244 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.246 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c18fb48a-913e-579a-8f7a-d16e9f55e7bc', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.256 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ba5e068-43, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.257 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap3ba5e068-43, col_values=(('qos', UUID('b9f0f4ed-bb7f-4651-a578-1ae2ecf06b95')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.258 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap3ba5e068-43, col_values=(('external_ids', {'iface-id': '3ba5e068-43ce-405c-886f-070951e83cf3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:11:0c:b9', 'vm-uuid': '3dbea315-3898-49bb-843e-b31c235e99e0'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:42 compute-0 NetworkManager[51813]: <info>  [1759217562.2613] manager: (tap3ba5e068-43): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:42 compute-0 nova_compute[189265]: 2025-09-30 07:32:42.269 2 INFO os_vif [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:0c:b9,bridge_name='br-int',has_traffic_filtering=True,id=3ba5e068-43ce-405c-886f-070951e83cf3,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ba5e068-43')
Sep 30 07:32:43 compute-0 nova_compute[189265]: 2025-09-30 07:32:43.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:43 compute-0 nova_compute[189265]: 2025-09-30 07:32:43.835 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:32:43 compute-0 nova_compute[189265]: 2025-09-30 07:32:43.836 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:32:43 compute-0 nova_compute[189265]: 2025-09-30 07:32:43.836 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] No VIF found with MAC fa:16:3e:11:0c:b9, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 07:32:43 compute-0 nova_compute[189265]: 2025-09-30 07:32:43.837 2 INFO nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Using config drive
Sep 30 07:32:44 compute-0 nova_compute[189265]: 2025-09-30 07:32:44.351 2 WARNING neutronclient.v2_0.client [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:32:44 compute-0 podman[219709]: 2025-09-30 07:32:44.52504727 +0000 UTC m=+0.100775843 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, release=1755695350, config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Sep 30 07:32:44 compute-0 nova_compute[189265]: 2025-09-30 07:32:44.531 2 INFO nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Creating config drive at /var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk.config
Sep 30 07:32:44 compute-0 nova_compute[189265]: 2025-09-30 07:32:44.537 2 DEBUG oslo_concurrency.processutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmptu9ra_7y execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:32:44 compute-0 nova_compute[189265]: 2025-09-30 07:32:44.677 2 DEBUG oslo_concurrency.processutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmptu9ra_7y" returned: 0 in 0.140s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:32:44 compute-0 kernel: tap3ba5e068-43: entered promiscuous mode
Sep 30 07:32:44 compute-0 NetworkManager[51813]: <info>  [1759217564.7618] manager: (tap3ba5e068-43): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Sep 30 07:32:44 compute-0 ovn_controller[91436]: 2025-09-30T07:32:44Z|00178|binding|INFO|Claiming lport 3ba5e068-43ce-405c-886f-070951e83cf3 for this chassis.
Sep 30 07:32:44 compute-0 ovn_controller[91436]: 2025-09-30T07:32:44Z|00179|binding|INFO|3ba5e068-43ce-405c-886f-070951e83cf3: Claiming fa:16:3e:11:0c:b9 10.100.0.5
Sep 30 07:32:44 compute-0 nova_compute[189265]: 2025-09-30 07:32:44.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.771 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:0c:b9 10.100.0.5'], port_security=['fa:16:3e:11:0c:b9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '3dbea315-3898-49bb-843e-b31c235e99e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=3ba5e068-43ce-405c-886f-070951e83cf3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.771 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 3ba5e068-43ce-405c-886f-070951e83cf3 in datapath c99c822b-3191-49e5-b938-903e25b4a9bb bound to our chassis
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.773 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:32:44 compute-0 ovn_controller[91436]: 2025-09-30T07:32:44Z|00180|binding|INFO|Setting lport 3ba5e068-43ce-405c-886f-070951e83cf3 ovn-installed in OVS
Sep 30 07:32:44 compute-0 ovn_controller[91436]: 2025-09-30T07:32:44Z|00181|binding|INFO|Setting lport 3ba5e068-43ce-405c-886f-070951e83cf3 up in Southbound
Sep 30 07:32:44 compute-0 nova_compute[189265]: 2025-09-30 07:32:44.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:32:44 compute-0 nova_compute[189265]: 2025-09-30 07:32:44.789 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.788 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[ec0f9941-765c-42d7-9223-e71174f03deb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.789 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc99c822b-31 in ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.794 210650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc99c822b-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.794 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0126c78d-a216-4af0-900b-e22f88eec6eb]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.795 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[3d781ae1-1d85-4ff8-b37b-a351d86bffa6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:44 compute-0 systemd-udevd[219751]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.811 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[1bd5937d-4bcc-44a3-92fc-c64ca9e3fe6a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:44 compute-0 systemd-machined[149233]: New machine qemu-14-instance-00000013.
Sep 30 07:32:44 compute-0 NetworkManager[51813]: <info>  [1759217564.8311] device (tap3ba5e068-43): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:32:44 compute-0 NetworkManager[51813]: <info>  [1759217564.8321] device (tap3ba5e068-43): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.831 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0694c8ba-1de0-4033-a21e-ecdb2ce42c55]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:44 compute-0 systemd[1]: Started Virtual Machine qemu-14-instance-00000013.
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.867 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[54cbc97a-4971-4f86-b1b5-5bdbb334bcf2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.872 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f0fa13ab-80d9-48d9-952c-37c7ab6dc6b5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:44 compute-0 systemd-udevd[219755]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:32:44 compute-0 NetworkManager[51813]: <info>  [1759217564.8765] manager: (tapc99c822b-30): new Veth device (/org/freedesktop/NetworkManager/Devices/65)
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.912 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[d4ff2384-4193-4923-9488-aab580680d88]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.915 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[913d6f35-29b2-4cfe-93c1-3a8b97e6893b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:44 compute-0 NetworkManager[51813]: <info>  [1759217564.9378] device (tapc99c822b-30): carrier: link connected
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.943 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[637c1be4-8d6b-4eb2-a4c9-09c55db7ef93]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.966 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[fd91c308-2e8e-4b2c-a8e1-2d99a44c0af7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542265, 'reachable_time': 25083, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219783, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:44.984 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[3e651d43-551c-475d-8a5a-539ab65c432b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:678c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542265, 'tstamp': 542265}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219784, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.010 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[67edf105-8c6c-4189-ad2a-6d2cb2f7137b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542265, 'reachable_time': 25083, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219785, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:45 compute-0 nova_compute[189265]: 2025-09-30 07:32:45.020 2 DEBUG nova.compute.manager [req-e3330830-eb17-4fdc-9841-8dfa06b17643 req-39ed3735-b57f-4e9c-b8b5-1914595386f7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Received event network-vif-plugged-3ba5e068-43ce-405c-886f-070951e83cf3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:32:45 compute-0 nova_compute[189265]: 2025-09-30 07:32:45.021 2 DEBUG oslo_concurrency.lockutils [req-e3330830-eb17-4fdc-9841-8dfa06b17643 req-39ed3735-b57f-4e9c-b8b5-1914595386f7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:32:45 compute-0 nova_compute[189265]: 2025-09-30 07:32:45.022 2 DEBUG oslo_concurrency.lockutils [req-e3330830-eb17-4fdc-9841-8dfa06b17643 req-39ed3735-b57f-4e9c-b8b5-1914595386f7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:32:45 compute-0 nova_compute[189265]: 2025-09-30 07:32:45.022 2 DEBUG oslo_concurrency.lockutils [req-e3330830-eb17-4fdc-9841-8dfa06b17643 req-39ed3735-b57f-4e9c-b8b5-1914595386f7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:32:45 compute-0 nova_compute[189265]: 2025-09-30 07:32:45.023 2 DEBUG nova.compute.manager [req-e3330830-eb17-4fdc-9841-8dfa06b17643 req-39ed3735-b57f-4e9c-b8b5-1914595386f7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Processing event network-vif-plugged-3ba5e068-43ce-405c-886f-070951e83cf3 _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.063 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[db6d0270-6751-4c2e-aee8-155a9a734176]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.164 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[33dc5d6f-cc0b-45a9-8052-c7f3b2b82616]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.166 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.166 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.166 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc99c822b-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:32:45 compute-0 kernel: tapc99c822b-30: entered promiscuous mode
Sep 30 07:32:45 compute-0 nova_compute[189265]: 2025-09-30 07:32:45.168 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:45 compute-0 NetworkManager[51813]: <info>  [1759217565.1689] manager: (tapc99c822b-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.171 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc99c822b-30, col_values=(('external_ids', {'iface-id': '67b7df48-3f38-444a-8506-1c0ec5bd1d15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:32:45 compute-0 nova_compute[189265]: 2025-09-30 07:32:45.174 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:45 compute-0 ovn_controller[91436]: 2025-09-30T07:32:45Z|00182|binding|INFO|Releasing lport 67b7df48-3f38-444a-8506-1c0ec5bd1d15 from this chassis (sb_readonly=0)
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.176 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[198bcfbb-eea6-4d0c-b678-78c33a2b5ff1]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.176 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.176 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.176 100322 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for c99c822b-3191-49e5-b938-903e25b4a9bb disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.177 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.177 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[5dd09f5c-8d50-4d84-92da-2205d9fde136]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.177 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.178 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[51ff0d18-927c-4002-9485-68297ed56c15]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.178 100322 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: global
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     log         /dev/log local0 debug
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     log-tag     haproxy-metadata-proxy-c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     user        root
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     group       root
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     maxconn     1024
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     pidfile     /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     daemon
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: defaults
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     log global
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     mode http
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     option httplog
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     option dontlognull
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     option http-server-close
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     option forwardfor
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     retries                 3
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     timeout http-request    30s
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     timeout connect         30s
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     timeout client          32s
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     timeout server          32s
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     timeout http-keep-alive 30s
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: listen listener
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     bind 169.254.169.254:80
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:     http-request add-header X-OVN-Network-ID c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 07:32:45 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:32:45.179 100322 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'env', 'PROCESS_TAG=haproxy-c99c822b-3191-49e5-b938-903e25b4a9bb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c99c822b-3191-49e5-b938-903e25b4a9bb.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 07:32:45 compute-0 nova_compute[189265]: 2025-09-30 07:32:45.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:45 compute-0 podman[219817]: 2025-09-30 07:32:45.539858015 +0000 UTC m=+0.032506208 image pull eeebcc09bc72f81ab45f5ab87eb8f6a7b554b949227aeec082bdb0732754ddc8 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 07:32:45 compute-0 podman[219817]: 2025-09-30 07:32:45.691920264 +0000 UTC m=+0.184568457 container create cafba4cf241096d989f2f80e0ad2b4f4549b49d8710d6344cfbc604109bf66f1 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 07:32:45 compute-0 systemd[1]: Started libpod-conmon-cafba4cf241096d989f2f80e0ad2b4f4549b49d8710d6344cfbc604109bf66f1.scope.
Sep 30 07:32:45 compute-0 nova_compute[189265]: 2025-09-30 07:32:45.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:32:45 compute-0 nova_compute[189265]: 2025-09-30 07:32:45.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:32:45 compute-0 nova_compute[189265]: 2025-09-30 07:32:45.790 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:32:45 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:32:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5b858e933b595d09be88bc21d91acdaac0facdc0d72ff7dc7b261bc58d2ca8e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 07:32:45 compute-0 podman[219817]: 2025-09-30 07:32:45.828987841 +0000 UTC m=+0.321636034 container init cafba4cf241096d989f2f80e0ad2b4f4549b49d8710d6344cfbc604109bf66f1 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 07:32:45 compute-0 podman[219817]: 2025-09-30 07:32:45.842717457 +0000 UTC m=+0.335365650 container start cafba4cf241096d989f2f80e0ad2b4f4549b49d8710d6344cfbc604109bf66f1 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 07:32:45 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[219838]: [NOTICE]   (219842) : New worker (219844) forked
Sep 30 07:32:45 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[219838]: [NOTICE]   (219842) : Loading success.
Sep 30 07:32:46 compute-0 nova_compute[189265]: 2025-09-30 07:32:46.181 2 DEBUG nova.compute.manager [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 07:32:46 compute-0 nova_compute[189265]: 2025-09-30 07:32:46.185 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 07:32:46 compute-0 nova_compute[189265]: 2025-09-30 07:32:46.190 2 INFO nova.virt.libvirt.driver [-] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Instance spawned successfully.
Sep 30 07:32:46 compute-0 nova_compute[189265]: 2025-09-30 07:32:46.191 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 07:32:46 compute-0 nova_compute[189265]: 2025-09-30 07:32:46.722 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:32:46 compute-0 nova_compute[189265]: 2025-09-30 07:32:46.723 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:32:46 compute-0 nova_compute[189265]: 2025-09-30 07:32:46.723 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:32:46 compute-0 nova_compute[189265]: 2025-09-30 07:32:46.724 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:32:46 compute-0 nova_compute[189265]: 2025-09-30 07:32:46.725 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:32:46 compute-0 nova_compute[189265]: 2025-09-30 07:32:46.726 2 DEBUG nova.virt.libvirt.driver [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:32:47 compute-0 nova_compute[189265]: 2025-09-30 07:32:47.086 2 DEBUG nova.compute.manager [req-9d226a69-74c3-4f6d-9e37-0e81276b77a0 req-d7c3f75b-a9b8-45da-91e7-619fc4fd021a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Received event network-vif-plugged-3ba5e068-43ce-405c-886f-070951e83cf3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:32:47 compute-0 nova_compute[189265]: 2025-09-30 07:32:47.087 2 DEBUG oslo_concurrency.lockutils [req-9d226a69-74c3-4f6d-9e37-0e81276b77a0 req-d7c3f75b-a9b8-45da-91e7-619fc4fd021a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:32:47 compute-0 nova_compute[189265]: 2025-09-30 07:32:47.088 2 DEBUG oslo_concurrency.lockutils [req-9d226a69-74c3-4f6d-9e37-0e81276b77a0 req-d7c3f75b-a9b8-45da-91e7-619fc4fd021a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:32:47 compute-0 nova_compute[189265]: 2025-09-30 07:32:47.089 2 DEBUG oslo_concurrency.lockutils [req-9d226a69-74c3-4f6d-9e37-0e81276b77a0 req-d7c3f75b-a9b8-45da-91e7-619fc4fd021a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:32:47 compute-0 nova_compute[189265]: 2025-09-30 07:32:47.089 2 DEBUG nova.compute.manager [req-9d226a69-74c3-4f6d-9e37-0e81276b77a0 req-d7c3f75b-a9b8-45da-91e7-619fc4fd021a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] No waiting events found dispatching network-vif-plugged-3ba5e068-43ce-405c-886f-070951e83cf3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:32:47 compute-0 nova_compute[189265]: 2025-09-30 07:32:47.090 2 WARNING nova.compute.manager [req-9d226a69-74c3-4f6d-9e37-0e81276b77a0 req-d7c3f75b-a9b8-45da-91e7-619fc4fd021a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Received unexpected event network-vif-plugged-3ba5e068-43ce-405c-886f-070951e83cf3 for instance with vm_state building and task_state spawning.
Sep 30 07:32:47 compute-0 nova_compute[189265]: 2025-09-30 07:32:47.241 2 INFO nova.compute.manager [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Took 8.99 seconds to spawn the instance on the hypervisor.
Sep 30 07:32:47 compute-0 nova_compute[189265]: 2025-09-30 07:32:47.242 2 DEBUG nova.compute.manager [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:32:47 compute-0 nova_compute[189265]: 2025-09-30 07:32:47.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:47 compute-0 nova_compute[189265]: 2025-09-30 07:32:47.831 2 INFO nova.compute.manager [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Took 14.29 seconds to build instance.
Sep 30 07:32:48 compute-0 nova_compute[189265]: 2025-09-30 07:32:48.349 2 DEBUG oslo_concurrency.lockutils [None req-70321904-24ea-4d23-8fc2-b9ddcf445fb5 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "3dbea315-3898-49bb-843e-b31c235e99e0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.839s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:32:48 compute-0 podman[219854]: 2025-09-30 07:32:48.515371546 +0000 UTC m=+0.086414669 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Sep 30 07:32:48 compute-0 podman[219853]: 2025-09-30 07:32:48.537264116 +0000 UTC m=+0.111028027 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:32:48 compute-0 podman[219855]: 2025-09-30 07:32:48.56587872 +0000 UTC m=+0.130107277 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 07:32:48 compute-0 nova_compute[189265]: 2025-09-30 07:32:48.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:51 compute-0 nova_compute[189265]: 2025-09-30 07:32:51.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:32:52 compute-0 nova_compute[189265]: 2025-09-30 07:32:52.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:52 compute-0 nova_compute[189265]: 2025-09-30 07:32:52.300 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:32:52 compute-0 nova_compute[189265]: 2025-09-30 07:32:52.300 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:32:52 compute-0 nova_compute[189265]: 2025-09-30 07:32:52.300 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:32:52 compute-0 nova_compute[189265]: 2025-09-30 07:32:52.301 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:32:53 compute-0 nova_compute[189265]: 2025-09-30 07:32:53.348 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:32:53 compute-0 nova_compute[189265]: 2025-09-30 07:32:53.417 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:32:53 compute-0 nova_compute[189265]: 2025-09-30 07:32:53.418 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:32:53 compute-0 nova_compute[189265]: 2025-09-30 07:32:53.476 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:32:53 compute-0 nova_compute[189265]: 2025-09-30 07:32:53.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:53 compute-0 nova_compute[189265]: 2025-09-30 07:32:53.667 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:32:53 compute-0 nova_compute[189265]: 2025-09-30 07:32:53.668 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:32:53 compute-0 nova_compute[189265]: 2025-09-30 07:32:53.685 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:32:53 compute-0 nova_compute[189265]: 2025-09-30 07:32:53.685 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5682MB free_disk=73.30304718017578GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:32:53 compute-0 nova_compute[189265]: 2025-09-30 07:32:53.686 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:32:53 compute-0 nova_compute[189265]: 2025-09-30 07:32:53.686 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:32:54 compute-0 nova_compute[189265]: 2025-09-30 07:32:54.752 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance 3dbea315-3898-49bb-843e-b31c235e99e0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:32:54 compute-0 nova_compute[189265]: 2025-09-30 07:32:54.752 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:32:54 compute-0 nova_compute[189265]: 2025-09-30 07:32:54.753 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:32:53 up  1:30,  0 user,  load average: 0.58, 0.28, 0.30\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_6431607f3dce4c88bbf6d17ee6cd45b2': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:32:54 compute-0 nova_compute[189265]: 2025-09-30 07:32:54.794 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:32:55 compute-0 nova_compute[189265]: 2025-09-30 07:32:55.301 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:32:55 compute-0 nova_compute[189265]: 2025-09-30 07:32:55.814 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:32:55 compute-0 nova_compute[189265]: 2025-09-30 07:32:55.814 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.128s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:32:56 compute-0 nova_compute[189265]: 2025-09-30 07:32:56.815 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:32:56 compute-0 nova_compute[189265]: 2025-09-30 07:32:56.816 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:32:57 compute-0 nova_compute[189265]: 2025-09-30 07:32:57.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:58 compute-0 nova_compute[189265]: 2025-09-30 07:32:58.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:32:58 compute-0 ovn_controller[91436]: 2025-09-30T07:32:58Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:11:0c:b9 10.100.0.5
Sep 30 07:32:58 compute-0 ovn_controller[91436]: 2025-09-30T07:32:58Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:11:0c:b9 10.100.0.5
Sep 30 07:32:58 compute-0 nova_compute[189265]: 2025-09-30 07:32:58.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:32:59 compute-0 nova_compute[189265]: 2025-09-30 07:32:59.432 2 DEBUG nova.virt.libvirt.driver [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Creating tmpfile /var/lib/nova/instances/tmpfz27_97u to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 07:32:59 compute-0 nova_compute[189265]: 2025-09-30 07:32:59.433 2 WARNING neutronclient.v2_0.client [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:32:59 compute-0 nova_compute[189265]: 2025-09-30 07:32:59.437 2 DEBUG nova.compute.manager [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfz27_97u',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 07:32:59 compute-0 podman[199733]: time="2025-09-30T07:32:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:32:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:32:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:32:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:32:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3468 "" "Go-http-client/1.1"
Sep 30 07:33:01 compute-0 openstack_network_exporter[201859]: ERROR   07:33:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:33:01 compute-0 openstack_network_exporter[201859]: ERROR   07:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:33:01 compute-0 openstack_network_exporter[201859]: ERROR   07:33:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:33:01 compute-0 openstack_network_exporter[201859]: ERROR   07:33:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:33:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:33:01 compute-0 openstack_network_exporter[201859]: ERROR   07:33:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:33:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:33:01 compute-0 nova_compute[189265]: 2025-09-30 07:33:01.469 2 WARNING neutronclient.v2_0.client [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:02 compute-0 nova_compute[189265]: 2025-09-30 07:33:02.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:03 compute-0 podman[219938]: 2025-09-30 07:33:03.50917224 +0000 UTC m=+0.076722240 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:33:03 compute-0 nova_compute[189265]: 2025-09-30 07:33:03.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:05 compute-0 nova_compute[189265]: 2025-09-30 07:33:05.637 2 DEBUG nova.compute.manager [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfz27_97u',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f680ae8a-3adb-4298-84c6-cae58224d553',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 07:33:06 compute-0 nova_compute[189265]: 2025-09-30 07:33:06.745 2 DEBUG oslo_concurrency.lockutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-f680ae8a-3adb-4298-84c6-cae58224d553" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:33:06 compute-0 nova_compute[189265]: 2025-09-30 07:33:06.745 2 DEBUG oslo_concurrency.lockutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-f680ae8a-3adb-4298-84c6-cae58224d553" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:33:06 compute-0 nova_compute[189265]: 2025-09-30 07:33:06.746 2 DEBUG nova.network.neutron [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:33:07 compute-0 nova_compute[189265]: 2025-09-30 07:33:07.256 2 WARNING neutronclient.v2_0.client [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:07 compute-0 nova_compute[189265]: 2025-09-30 07:33:07.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:08 compute-0 nova_compute[189265]: 2025-09-30 07:33:08.001 2 WARNING neutronclient.v2_0.client [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:08 compute-0 nova_compute[189265]: 2025-09-30 07:33:08.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:08 compute-0 nova_compute[189265]: 2025-09-30 07:33:08.836 2 DEBUG nova.network.neutron [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Updating instance_info_cache with network_info: [{"id": "28b88230-ad9c-48dd-a487-28043768f2c6", "address": "fa:16:3e:23:30:dc", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b88230-ad", "ovs_interfaceid": "28b88230-ad9c-48dd-a487-28043768f2c6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:33:09 compute-0 nova_compute[189265]: 2025-09-30 07:33:09.344 2 DEBUG oslo_concurrency.lockutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-f680ae8a-3adb-4298-84c6-cae58224d553" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:33:09 compute-0 nova_compute[189265]: 2025-09-30 07:33:09.364 2 DEBUG nova.virt.libvirt.driver [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfz27_97u',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f680ae8a-3adb-4298-84c6-cae58224d553',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 07:33:09 compute-0 nova_compute[189265]: 2025-09-30 07:33:09.365 2 DEBUG nova.virt.libvirt.driver [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Creating instance directory: /var/lib/nova/instances/f680ae8a-3adb-4298-84c6-cae58224d553 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 07:33:09 compute-0 nova_compute[189265]: 2025-09-30 07:33:09.366 2 DEBUG nova.virt.libvirt.driver [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Creating disk.info with the contents: {'/var/lib/nova/instances/f680ae8a-3adb-4298-84c6-cae58224d553/disk': 'qcow2', '/var/lib/nova/instances/f680ae8a-3adb-4298-84c6-cae58224d553/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 07:33:09 compute-0 nova_compute[189265]: 2025-09-30 07:33:09.367 2 DEBUG nova.virt.libvirt.driver [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 07:33:09 compute-0 nova_compute[189265]: 2025-09-30 07:33:09.368 2 DEBUG nova.objects.instance [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid f680ae8a-3adb-4298-84c6-cae58224d553 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:33:09 compute-0 nova_compute[189265]: 2025-09-30 07:33:09.875 2 DEBUG oslo_utils.imageutils.format_inspector [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:33:09 compute-0 nova_compute[189265]: 2025-09-30 07:33:09.881 2 DEBUG oslo_utils.imageutils.format_inspector [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:33:09 compute-0 nova_compute[189265]: 2025-09-30 07:33:09.884 2 DEBUG oslo_concurrency.processutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:33:09 compute-0 nova_compute[189265]: 2025-09-30 07:33:09.967 2 DEBUG oslo_concurrency.processutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:33:09 compute-0 nova_compute[189265]: 2025-09-30 07:33:09.968 2 DEBUG oslo_concurrency.lockutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:33:09 compute-0 nova_compute[189265]: 2025-09-30 07:33:09.969 2 DEBUG oslo_concurrency.lockutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:33:09 compute-0 nova_compute[189265]: 2025-09-30 07:33:09.970 2 DEBUG oslo_utils.imageutils.format_inspector [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:33:09 compute-0 nova_compute[189265]: 2025-09-30 07:33:09.976 2 DEBUG oslo_utils.imageutils.format_inspector [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:33:09 compute-0 nova_compute[189265]: 2025-09-30 07:33:09.977 2 DEBUG oslo_concurrency.processutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.041 2 DEBUG oslo_concurrency.processutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.043 2 DEBUG oslo_concurrency.processutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/f680ae8a-3adb-4298-84c6-cae58224d553/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.087 2 DEBUG oslo_concurrency.processutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/f680ae8a-3adb-4298-84c6-cae58224d553/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.089 2 DEBUG oslo_concurrency.lockutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.120s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.090 2 DEBUG oslo_concurrency.processutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.160 2 DEBUG oslo_concurrency.processutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.160 2 DEBUG nova.virt.disk.api [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Checking if we can resize image /var/lib/nova/instances/f680ae8a-3adb-4298-84c6-cae58224d553/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.161 2 DEBUG oslo_concurrency.processutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f680ae8a-3adb-4298-84c6-cae58224d553/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.229 2 DEBUG oslo_concurrency.processutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f680ae8a-3adb-4298-84c6-cae58224d553/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.230 2 DEBUG nova.virt.disk.api [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Cannot resize image /var/lib/nova/instances/f680ae8a-3adb-4298-84c6-cae58224d553/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.230 2 DEBUG nova.objects.instance [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'migration_context' on Instance uuid f680ae8a-3adb-4298-84c6-cae58224d553 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.767 2 DEBUG nova.objects.base [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<f680ae8a-3adb-4298-84c6-cae58224d553> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.768 2 DEBUG oslo_concurrency.processutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/f680ae8a-3adb-4298-84c6-cae58224d553/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.797 2 DEBUG oslo_concurrency.processutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/f680ae8a-3adb-4298-84c6-cae58224d553/disk.config 497664" returned: 0 in 0.029s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.798 2 DEBUG nova.virt.libvirt.driver [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.800 2 DEBUG nova.virt.libvirt.vif [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T07:32:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1562644846',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1562644846',id=18,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:32:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-szxf7qui',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:32:27Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=f680ae8a-3adb-4298-84c6-cae58224d553,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "28b88230-ad9c-48dd-a487-28043768f2c6", "address": "fa:16:3e:23:30:dc", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap28b88230-ad", "ovs_interfaceid": "28b88230-ad9c-48dd-a487-28043768f2c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.800 2 DEBUG nova.network.os_vif_util [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "28b88230-ad9c-48dd-a487-28043768f2c6", "address": "fa:16:3e:23:30:dc", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap28b88230-ad", "ovs_interfaceid": "28b88230-ad9c-48dd-a487-28043768f2c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.802 2 DEBUG nova.network.os_vif_util [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:30:dc,bridge_name='br-int',has_traffic_filtering=True,id=28b88230-ad9c-48dd-a487-28043768f2c6,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b88230-ad') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.803 2 DEBUG os_vif [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:30:dc,bridge_name='br-int',has_traffic_filtering=True,id=28b88230-ad9c-48dd-a487-28043768f2c6,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b88230-ad') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.804 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.804 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.806 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9b3f6eab-1150-5685-bc26-4203292bb110', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.848 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap28b88230-ad, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.849 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap28b88230-ad, col_values=(('qos', UUID('6c73f78f-f580-4986-b80a-6eabfe3a5b4a')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.849 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap28b88230-ad, col_values=(('external_ids', {'iface-id': '28b88230-ad9c-48dd-a487-28043768f2c6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:23:30:dc', 'vm-uuid': 'f680ae8a-3adb-4298-84c6-cae58224d553'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:10 compute-0 NetworkManager[51813]: <info>  [1759217590.8525] manager: (tap28b88230-ad): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.863 2 INFO os_vif [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:30:dc,bridge_name='br-int',has_traffic_filtering=True,id=28b88230-ad9c-48dd-a487-28043768f2c6,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b88230-ad')
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.864 2 DEBUG nova.virt.libvirt.driver [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.864 2 DEBUG nova.compute.manager [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfz27_97u',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f680ae8a-3adb-4298-84c6-cae58224d553',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.865 2 WARNING neutronclient.v2_0.client [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:10 compute-0 nova_compute[189265]: 2025-09-30 07:33:10.994 2 WARNING neutronclient.v2_0.client [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:12 compute-0 podman[219982]: 2025-09-30 07:33:12.486607721 +0000 UTC m=+0.071442209 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, io.buildah.version=1.41.4, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 07:33:12 compute-0 nova_compute[189265]: 2025-09-30 07:33:12.911 2 DEBUG nova.network.neutron [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Port 28b88230-ad9c-48dd-a487-28043768f2c6 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 07:33:12 compute-0 nova_compute[189265]: 2025-09-30 07:33:12.930 2 DEBUG nova.compute.manager [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpfz27_97u',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f680ae8a-3adb-4298-84c6-cae58224d553',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 07:33:13 compute-0 nova_compute[189265]: 2025-09-30 07:33:13.661 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:14 compute-0 ovn_controller[91436]: 2025-09-30T07:33:14Z|00183|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Sep 30 07:33:15 compute-0 podman[220002]: 2025-09-30 07:33:15.493217197 +0000 UTC m=+0.075000751 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, build-date=2025-08-20T13:12:41)
Sep 30 07:33:15 compute-0 kernel: tap28b88230-ad: entered promiscuous mode
Sep 30 07:33:15 compute-0 NetworkManager[51813]: <info>  [1759217595.8004] manager: (tap28b88230-ad): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Sep 30 07:33:15 compute-0 ovn_controller[91436]: 2025-09-30T07:33:15Z|00184|binding|INFO|Claiming lport 28b88230-ad9c-48dd-a487-28043768f2c6 for this additional chassis.
Sep 30 07:33:15 compute-0 ovn_controller[91436]: 2025-09-30T07:33:15Z|00185|binding|INFO|28b88230-ad9c-48dd-a487-28043768f2c6: Claiming fa:16:3e:23:30:dc 10.100.0.10
Sep 30 07:33:15 compute-0 nova_compute[189265]: 2025-09-30 07:33:15.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:15.812 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:30:dc 10.100.0.10'], port_security=['fa:16:3e:23:30:dc 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'f680ae8a-3adb-4298-84c6-cae58224d553', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '10', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=28b88230-ad9c-48dd-a487-28043768f2c6) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:33:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:15.813 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 28b88230-ad9c-48dd-a487-28043768f2c6 in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:33:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:15.816 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:33:15 compute-0 ovn_controller[91436]: 2025-09-30T07:33:15Z|00186|binding|INFO|Setting lport 28b88230-ad9c-48dd-a487-28043768f2c6 ovn-installed in OVS
Sep 30 07:33:15 compute-0 nova_compute[189265]: 2025-09-30 07:33:15.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:15.835 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[6e3ca148-233a-44fb-8dd0-1c80fd11876e]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:15 compute-0 nova_compute[189265]: 2025-09-30 07:33:15.836 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:15 compute-0 nova_compute[189265]: 2025-09-30 07:33:15.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:15 compute-0 systemd-udevd[220041]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:33:15 compute-0 systemd-machined[149233]: New machine qemu-15-instance-00000012.
Sep 30 07:33:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:15.880 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[bb3f3e70-9b65-4c55-b23e-0eb0415d2367]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:15.882 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[1db6be5e-1cb5-4abc-8d55-41227e9f6ad2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:15 compute-0 NetworkManager[51813]: <info>  [1759217595.8884] device (tap28b88230-ad): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:33:15 compute-0 systemd[1]: Started Virtual Machine qemu-15-instance-00000012.
Sep 30 07:33:15 compute-0 NetworkManager[51813]: <info>  [1759217595.8911] device (tap28b88230-ad): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:33:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:15.923 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[42cdd883-d8fa-4629-814d-189da89a53d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:15.949 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[34b6f2ee-4a2c-430d-8d94-912f69137a6f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542265, 'reachable_time': 25083, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220046, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:15.972 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0273893e-7444-4fd1-b998-f6987cab644c]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542282, 'tstamp': 542282}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220049, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542287, 'tstamp': 542287}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220049, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:15 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:15.975 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:16 compute-0 nova_compute[189265]: 2025-09-30 07:33:16.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:16 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:16.014 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc99c822b-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:16 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:16.015 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:33:16 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:16.015 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc99c822b-30, col_values=(('external_ids', {'iface-id': '67b7df48-3f38-444a-8506-1c0ec5bd1d15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:16 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:16.016 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:33:16 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:16.019 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[4d332ff8-e3ad-4f6b-8233-fbbf0b49e8ec]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-c99c822b-3191-49e5-b938-903e25b4a9bb\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID c99c822b-3191-49e5-b938-903e25b4a9bb\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:18 compute-0 nova_compute[189265]: 2025-09-30 07:33:18.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:19 compute-0 podman[220075]: 2025-09-30 07:33:19.52886395 +0000 UTC m=+0.097790047 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 07:33:19 compute-0 podman[220076]: 2025-09-30 07:33:19.529002304 +0000 UTC m=+0.097743446 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 07:33:19 compute-0 podman[220077]: 2025-09-30 07:33:19.564418174 +0000 UTC m=+0.139590281 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Sep 30 07:33:20 compute-0 ovn_controller[91436]: 2025-09-30T07:33:20Z|00187|binding|INFO|Claiming lport 28b88230-ad9c-48dd-a487-28043768f2c6 for this chassis.
Sep 30 07:33:20 compute-0 ovn_controller[91436]: 2025-09-30T07:33:20Z|00188|binding|INFO|28b88230-ad9c-48dd-a487-28043768f2c6: Claiming fa:16:3e:23:30:dc 10.100.0.10
Sep 30 07:33:20 compute-0 ovn_controller[91436]: 2025-09-30T07:33:20Z|00189|binding|INFO|Setting lport 28b88230-ad9c-48dd-a487-28043768f2c6 up in Southbound
Sep 30 07:33:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:20.565 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:33:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:20.566 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:33:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:20.566 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:33:20 compute-0 nova_compute[189265]: 2025-09-30 07:33:20.853 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:21 compute-0 nova_compute[189265]: 2025-09-30 07:33:21.314 2 INFO nova.compute.manager [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Post operation of migration started
Sep 30 07:33:21 compute-0 nova_compute[189265]: 2025-09-30 07:33:21.315 2 WARNING neutronclient.v2_0.client [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:21 compute-0 nova_compute[189265]: 2025-09-30 07:33:21.765 2 WARNING neutronclient.v2_0.client [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:21 compute-0 nova_compute[189265]: 2025-09-30 07:33:21.766 2 WARNING neutronclient.v2_0.client [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:22 compute-0 nova_compute[189265]: 2025-09-30 07:33:22.760 2 DEBUG oslo_concurrency.lockutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-f680ae8a-3adb-4298-84c6-cae58224d553" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:33:22 compute-0 nova_compute[189265]: 2025-09-30 07:33:22.762 2 DEBUG oslo_concurrency.lockutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-f680ae8a-3adb-4298-84c6-cae58224d553" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:33:22 compute-0 nova_compute[189265]: 2025-09-30 07:33:22.762 2 DEBUG nova.network.neutron [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:33:23 compute-0 nova_compute[189265]: 2025-09-30 07:33:23.269 2 WARNING neutronclient.v2_0.client [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:23 compute-0 nova_compute[189265]: 2025-09-30 07:33:23.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:24 compute-0 nova_compute[189265]: 2025-09-30 07:33:24.611 2 WARNING neutronclient.v2_0.client [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:24 compute-0 nova_compute[189265]: 2025-09-30 07:33:24.749 2 DEBUG nova.network.neutron [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Updating instance_info_cache with network_info: [{"id": "28b88230-ad9c-48dd-a487-28043768f2c6", "address": "fa:16:3e:23:30:dc", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b88230-ad", "ovs_interfaceid": "28b88230-ad9c-48dd-a487-28043768f2c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:33:25 compute-0 nova_compute[189265]: 2025-09-30 07:33:25.271 2 DEBUG oslo_concurrency.lockutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-f680ae8a-3adb-4298-84c6-cae58224d553" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:33:25 compute-0 nova_compute[189265]: 2025-09-30 07:33:25.792 2 DEBUG oslo_concurrency.lockutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:33:25 compute-0 nova_compute[189265]: 2025-09-30 07:33:25.793 2 DEBUG oslo_concurrency.lockutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:33:25 compute-0 nova_compute[189265]: 2025-09-30 07:33:25.793 2 DEBUG oslo_concurrency.lockutils [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:33:25 compute-0 nova_compute[189265]: 2025-09-30 07:33:25.798 2 INFO nova.virt.libvirt.driver [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 07:33:25 compute-0 virtqemud[189090]: Domain id=15 name='instance-00000012' uuid=f680ae8a-3adb-4298-84c6-cae58224d553 is tainted: custom-monitor
Sep 30 07:33:25 compute-0 nova_compute[189265]: 2025-09-30 07:33:25.854 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:26 compute-0 nova_compute[189265]: 2025-09-30 07:33:26.805 2 INFO nova.virt.libvirt.driver [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 07:33:27 compute-0 nova_compute[189265]: 2025-09-30 07:33:27.812 2 INFO nova.virt.libvirt.driver [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 07:33:27 compute-0 nova_compute[189265]: 2025-09-30 07:33:27.818 2 DEBUG nova.compute.manager [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:33:28 compute-0 nova_compute[189265]: 2025-09-30 07:33:28.328 2 DEBUG nova.objects.instance [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 07:33:28 compute-0 nova_compute[189265]: 2025-09-30 07:33:28.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:29 compute-0 nova_compute[189265]: 2025-09-30 07:33:29.349 2 WARNING neutronclient.v2_0.client [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:29 compute-0 nova_compute[189265]: 2025-09-30 07:33:29.443 2 WARNING neutronclient.v2_0.client [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:29 compute-0 nova_compute[189265]: 2025-09-30 07:33:29.444 2 WARNING neutronclient.v2_0.client [None req-274bc090-9cfd-4f02-bb86-38da6d21070e e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:29 compute-0 podman[199733]: time="2025-09-30T07:33:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:33:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:33:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:33:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:33:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3472 "" "Go-http-client/1.1"
Sep 30 07:33:30 compute-0 nova_compute[189265]: 2025-09-30 07:33:30.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:31 compute-0 openstack_network_exporter[201859]: ERROR   07:33:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:33:31 compute-0 openstack_network_exporter[201859]: ERROR   07:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:33:31 compute-0 openstack_network_exporter[201859]: ERROR   07:33:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:33:31 compute-0 openstack_network_exporter[201859]: ERROR   07:33:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:33:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:33:31 compute-0 openstack_network_exporter[201859]: ERROR   07:33:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:33:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.055 2 DEBUG oslo_concurrency.lockutils [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "3dbea315-3898-49bb-843e-b31c235e99e0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.056 2 DEBUG oslo_concurrency.lockutils [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "3dbea315-3898-49bb-843e-b31c235e99e0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.057 2 DEBUG oslo_concurrency.lockutils [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.057 2 DEBUG oslo_concurrency.lockutils [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.058 2 DEBUG oslo_concurrency.lockutils [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.074 2 INFO nova.compute.manager [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Terminating instance
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.593 2 DEBUG nova.compute.manager [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:33:33 compute-0 kernel: tap3ba5e068-43 (unregistering): left promiscuous mode
Sep 30 07:33:33 compute-0 NetworkManager[51813]: <info>  [1759217613.6360] device (tap3ba5e068-43): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:33 compute-0 ovn_controller[91436]: 2025-09-30T07:33:33Z|00190|binding|INFO|Releasing lport 3ba5e068-43ce-405c-886f-070951e83cf3 from this chassis (sb_readonly=0)
Sep 30 07:33:33 compute-0 ovn_controller[91436]: 2025-09-30T07:33:33Z|00191|binding|INFO|Setting lport 3ba5e068-43ce-405c-886f-070951e83cf3 down in Southbound
Sep 30 07:33:33 compute-0 ovn_controller[91436]: 2025-09-30T07:33:33Z|00192|binding|INFO|Removing iface tap3ba5e068-43 ovn-installed in OVS
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.714 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:0c:b9 10.100.0.5'], port_security=['fa:16:3e:11:0c:b9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '3dbea315-3898-49bb-843e-b31c235e99e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '5', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=3ba5e068-43ce-405c-886f-070951e83cf3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.715 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 3ba5e068-43ce-405c-886f-070951e83cf3 in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.715 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.718 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.731 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[028d92c2-ed92-4054-91b1-dac9207be048]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:33 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000013.scope: Deactivated successfully.
Sep 30 07:33:33 compute-0 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000013.scope: Consumed 14.286s CPU time.
Sep 30 07:33:33 compute-0 systemd-machined[149233]: Machine qemu-14-instance-00000013 terminated.
Sep 30 07:33:33 compute-0 podman[220140]: 2025-09-30 07:33:33.742603697 +0000 UTC m=+0.083543117 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.767 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[72487886-0c0d-4923-9f5e-39c8376c012d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.770 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[1226056b-fb83-40cd-808e-a0be07a8af8b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.811 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[9540b1f0-a709-4e6c-950e-e25bbe681e1c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:33 compute-0 kernel: tap3ba5e068-43: entered promiscuous mode
Sep 30 07:33:33 compute-0 NetworkManager[51813]: <info>  [1759217613.8143] manager: (tap3ba5e068-43): new Tun device (/org/freedesktop/NetworkManager/Devices/69)
Sep 30 07:33:33 compute-0 systemd-udevd[220150]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:33:33 compute-0 kernel: tap3ba5e068-43 (unregistering): left promiscuous mode
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.821 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:33 compute-0 ovn_controller[91436]: 2025-09-30T07:33:33Z|00193|binding|INFO|Claiming lport 3ba5e068-43ce-405c-886f-070951e83cf3 for this chassis.
Sep 30 07:33:33 compute-0 ovn_controller[91436]: 2025-09-30T07:33:33Z|00194|binding|INFO|3ba5e068-43ce-405c-886f-070951e83cf3: Claiming fa:16:3e:11:0c:b9 10.100.0.5
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.832 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:0c:b9 10.100.0.5'], port_security=['fa:16:3e:11:0c:b9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '3dbea315-3898-49bb-843e-b31c235e99e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '5', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=3ba5e068-43ce-405c-886f-070951e83cf3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.833 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[4a73efbf-ae4a-482f-b4fb-508309a0cd35]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542265, 'reachable_time': 25083, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220176, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.848 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[dbcfe387-5c05-4b88-9626-cdff72ece5af]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542282, 'tstamp': 542282}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220181, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542287, 'tstamp': 542287}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220181, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:33 compute-0 ovn_controller[91436]: 2025-09-30T07:33:33Z|00195|binding|INFO|Releasing lport 3ba5e068-43ce-405c-886f-070951e83cf3 from this chassis (sb_readonly=0)
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.854 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.859 2 INFO nova.virt.libvirt.driver [-] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Instance destroyed successfully.
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.860 2 DEBUG nova.objects.instance [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lazy-loading 'resources' on Instance uuid 3dbea315-3898-49bb-843e-b31c235e99e0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.860 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:0c:b9 10.100.0.5'], port_security=['fa:16:3e:11:0c:b9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '3dbea315-3898-49bb-843e-b31c235e99e0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '5', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=3ba5e068-43ce-405c-886f-070951e83cf3) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.875 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.875 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc99c822b-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.876 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.876 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc99c822b-30, col_values=(('external_ids', {'iface-id': '67b7df48-3f38-444a-8506-1c0ec5bd1d15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.877 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.878 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[cd924523-c0fb-473c-a52e-fd4196406feb]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-c99c822b-3191-49e5-b938-903e25b4a9bb\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID c99c822b-3191-49e5-b938-903e25b4a9bb\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.879 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 3ba5e068-43ce-405c-886f-070951e83cf3 in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.880 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.891 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c8d38e7e-6d6f-4c2a-84fc-447ff6a2b2c2]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.917 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[a8404a61-e1aa-4243-a4e3-b0e9f717bbf8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.919 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[68eab2a1-0603-43c5-8e4b-ad7ce96ed7a2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.928 2 DEBUG nova.compute.manager [req-0d09fb1c-9bb4-4db4-8787-d7e9d70c45ff req-c571e939-c56a-4f62-92ba-4f90aacdd3e7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Received event network-vif-unplugged-3ba5e068-43ce-405c-886f-070951e83cf3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.929 2 DEBUG oslo_concurrency.lockutils [req-0d09fb1c-9bb4-4db4-8787-d7e9d70c45ff req-c571e939-c56a-4f62-92ba-4f90aacdd3e7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.929 2 DEBUG oslo_concurrency.lockutils [req-0d09fb1c-9bb4-4db4-8787-d7e9d70c45ff req-c571e939-c56a-4f62-92ba-4f90aacdd3e7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.930 2 DEBUG oslo_concurrency.lockutils [req-0d09fb1c-9bb4-4db4-8787-d7e9d70c45ff req-c571e939-c56a-4f62-92ba-4f90aacdd3e7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.930 2 DEBUG nova.compute.manager [req-0d09fb1c-9bb4-4db4-8787-d7e9d70c45ff req-c571e939-c56a-4f62-92ba-4f90aacdd3e7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] No waiting events found dispatching network-vif-unplugged-3ba5e068-43ce-405c-886f-070951e83cf3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.930 2 DEBUG nova.compute.manager [req-0d09fb1c-9bb4-4db4-8787-d7e9d70c45ff req-c571e939-c56a-4f62-92ba-4f90aacdd3e7 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Received event network-vif-unplugged-3ba5e068-43ce-405c-886f-070951e83cf3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.944 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[5b7da263-007c-4f7a-a7e4-d75630c7bc2b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.960 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[63eb83f1-5263-4ef9-920a-0b06abb6ba11]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 9, 'rx_bytes': 1756, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 9, 'rx_bytes': 1756, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542265, 'reachable_time': 25083, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220191, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.972 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[92a634fc-b429-4368-ad91-1554fceb95ad]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542282, 'tstamp': 542282}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220192, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542287, 'tstamp': 542287}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220192, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.973 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:33 compute-0 nova_compute[189265]: 2025-09-30 07:33:33.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.982 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc99c822b-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.982 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.982 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc99c822b-30, col_values=(('external_ids', {'iface-id': '67b7df48-3f38-444a-8506-1c0ec5bd1d15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.983 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.984 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[659dfeb7-e56f-43e9-93d3-c9bb887fb8ac]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-c99c822b-3191-49e5-b938-903e25b4a9bb\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID c99c822b-3191-49e5-b938-903e25b4a9bb\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.986 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 3ba5e068-43ce-405c-886f-070951e83cf3 in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:33:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:33.987 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:33:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:34.001 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[117e470e-f121-4201-9148-2d3216472c4a]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:34.038 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[c2b0d6a4-275d-44a7-aac3-6a8a562b3208]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:34.041 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a25073-762e-4fbd-a998-5b3a2fc49411]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:34.076 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[c9eb6704-a416-4b12-bc8a-deeda31c5f67]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:34.104 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[6589ff76-3df8-4506-babd-b22c300bfebb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 11, 'rx_bytes': 1756, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 11, 'rx_bytes': 1756, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542265, 'reachable_time': 25083, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220199, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:34.131 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[aa5a70f0-0ecd-4547-a4ec-e13b9f28f437]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542282, 'tstamp': 542282}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220200, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 542287, 'tstamp': 542287}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220200, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:34.132 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:34.139 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc99c822b-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:34.139 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:33:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:34.140 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc99c822b-30, col_values=(('external_ids', {'iface-id': '67b7df48-3f38-444a-8506-1c0ec5bd1d15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:34.140 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:33:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:34.142 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[acd4e714-db08-43b7-967c-47eea404c7c7]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-c99c822b-3191-49e5-b938-903e25b4a9bb\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID c99c822b-3191-49e5-b938-903e25b4a9bb\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.366 2 DEBUG nova.virt.libvirt.vif [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:32:31Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1531325207',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1531325207',id=19,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:32:47Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-kqrx0v8v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram
='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:32:47Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=3dbea315-3898-49bb-843e-b31c235e99e0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3ba5e068-43ce-405c-886f-070951e83cf3", "address": "fa:16:3e:11:0c:b9", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba5e068-43", "ovs_interfaceid": "3ba5e068-43ce-405c-886f-070951e83cf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.366 2 DEBUG nova.network.os_vif_util [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converting VIF {"id": "3ba5e068-43ce-405c-886f-070951e83cf3", "address": "fa:16:3e:11:0c:b9", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ba5e068-43", "ovs_interfaceid": "3ba5e068-43ce-405c-886f-070951e83cf3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.367 2 DEBUG nova.network.os_vif_util [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:11:0c:b9,bridge_name='br-int',has_traffic_filtering=True,id=3ba5e068-43ce-405c-886f-070951e83cf3,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ba5e068-43') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.369 2 DEBUG os_vif [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:0c:b9,bridge_name='br-int',has_traffic_filtering=True,id=3ba5e068-43ce-405c-886f-070951e83cf3,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ba5e068-43') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.370 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ba5e068-43, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.374 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=b9f0f4ed-bb7f-4651-a578-1ae2ecf06b95) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.379 2 INFO os_vif [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:11:0c:b9,bridge_name='br-int',has_traffic_filtering=True,id=3ba5e068-43ce-405c-886f-070951e83cf3,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ba5e068-43')
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.380 2 INFO nova.virt.libvirt.driver [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Deleting instance files /var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0_del
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.380 2 INFO nova.virt.libvirt.driver [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Deletion of /var/lib/nova/instances/3dbea315-3898-49bb-843e-b31c235e99e0_del complete
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.895 2 INFO nova.compute.manager [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Took 1.30 seconds to destroy the instance on the hypervisor.
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.896 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.897 2 DEBUG nova.compute.manager [-] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.897 2 DEBUG nova.network.neutron [-] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:33:34 compute-0 nova_compute[189265]: 2025-09-30 07:33:34.897 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:35 compute-0 nova_compute[189265]: 2025-09-30 07:33:35.806 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:35 compute-0 nova_compute[189265]: 2025-09-30 07:33:35.980 2 DEBUG nova.compute.manager [req-eb801065-952a-4bcb-8d3e-266df3b7707a req-1b74db4a-ad8d-4166-878a-782cd3b7573b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Received event network-vif-unplugged-3ba5e068-43ce-405c-886f-070951e83cf3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:33:35 compute-0 nova_compute[189265]: 2025-09-30 07:33:35.981 2 DEBUG oslo_concurrency.lockutils [req-eb801065-952a-4bcb-8d3e-266df3b7707a req-1b74db4a-ad8d-4166-878a-782cd3b7573b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:33:35 compute-0 nova_compute[189265]: 2025-09-30 07:33:35.981 2 DEBUG oslo_concurrency.lockutils [req-eb801065-952a-4bcb-8d3e-266df3b7707a req-1b74db4a-ad8d-4166-878a-782cd3b7573b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:33:35 compute-0 nova_compute[189265]: 2025-09-30 07:33:35.981 2 DEBUG oslo_concurrency.lockutils [req-eb801065-952a-4bcb-8d3e-266df3b7707a req-1b74db4a-ad8d-4166-878a-782cd3b7573b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3dbea315-3898-49bb-843e-b31c235e99e0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:33:35 compute-0 nova_compute[189265]: 2025-09-30 07:33:35.982 2 DEBUG nova.compute.manager [req-eb801065-952a-4bcb-8d3e-266df3b7707a req-1b74db4a-ad8d-4166-878a-782cd3b7573b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] No waiting events found dispatching network-vif-unplugged-3ba5e068-43ce-405c-886f-070951e83cf3 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:33:35 compute-0 nova_compute[189265]: 2025-09-30 07:33:35.982 2 DEBUG nova.compute.manager [req-eb801065-952a-4bcb-8d3e-266df3b7707a req-1b74db4a-ad8d-4166-878a-782cd3b7573b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Received event network-vif-unplugged-3ba5e068-43ce-405c-886f-070951e83cf3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:33:37 compute-0 nova_compute[189265]: 2025-09-30 07:33:37.295 2 DEBUG nova.network.neutron [-] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:33:37 compute-0 nova_compute[189265]: 2025-09-30 07:33:37.803 2 INFO nova.compute.manager [-] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Took 2.91 seconds to deallocate network for instance.
Sep 30 07:33:38 compute-0 nova_compute[189265]: 2025-09-30 07:33:38.039 2 DEBUG nova.compute.manager [req-3ca05da5-6a6f-4b3d-84d6-52849564cf4e req-3de0aa4b-9846-4eb5-b32c-ab3648153414 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3dbea315-3898-49bb-843e-b31c235e99e0] Received event network-vif-deleted-3ba5e068-43ce-405c-886f-070951e83cf3 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:33:38 compute-0 nova_compute[189265]: 2025-09-30 07:33:38.326 2 DEBUG oslo_concurrency.lockutils [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:33:38 compute-0 nova_compute[189265]: 2025-09-30 07:33:38.327 2 DEBUG oslo_concurrency.lockutils [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:33:38 compute-0 nova_compute[189265]: 2025-09-30 07:33:38.408 2 DEBUG nova.compute.provider_tree [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:33:38 compute-0 nova_compute[189265]: 2025-09-30 07:33:38.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:38 compute-0 nova_compute[189265]: 2025-09-30 07:33:38.917 2 DEBUG nova.scheduler.client.report [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:33:39 compute-0 nova_compute[189265]: 2025-09-30 07:33:39.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:39 compute-0 nova_compute[189265]: 2025-09-30 07:33:39.429 2 DEBUG oslo_concurrency.lockutils [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.102s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:33:39 compute-0 nova_compute[189265]: 2025-09-30 07:33:39.454 2 INFO nova.scheduler.client.report [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Deleted allocations for instance 3dbea315-3898-49bb-843e-b31c235e99e0
Sep 30 07:33:40 compute-0 nova_compute[189265]: 2025-09-30 07:33:40.487 2 DEBUG oslo_concurrency.lockutils [None req-9a8953a7-bbe2-438f-8c1a-5603da129777 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "3dbea315-3898-49bb-843e-b31c235e99e0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.431s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:33:41 compute-0 nova_compute[189265]: 2025-09-30 07:33:41.955 2 DEBUG oslo_concurrency.lockutils [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "f680ae8a-3adb-4298-84c6-cae58224d553" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:33:41 compute-0 nova_compute[189265]: 2025-09-30 07:33:41.956 2 DEBUG oslo_concurrency.lockutils [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "f680ae8a-3adb-4298-84c6-cae58224d553" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:33:41 compute-0 nova_compute[189265]: 2025-09-30 07:33:41.956 2 DEBUG oslo_concurrency.lockutils [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "f680ae8a-3adb-4298-84c6-cae58224d553-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:33:41 compute-0 nova_compute[189265]: 2025-09-30 07:33:41.956 2 DEBUG oslo_concurrency.lockutils [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "f680ae8a-3adb-4298-84c6-cae58224d553-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:33:41 compute-0 nova_compute[189265]: 2025-09-30 07:33:41.957 2 DEBUG oslo_concurrency.lockutils [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "f680ae8a-3adb-4298-84c6-cae58224d553-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:33:41 compute-0 nova_compute[189265]: 2025-09-30 07:33:41.975 2 INFO nova.compute.manager [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Terminating instance
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.494 2 DEBUG nova.compute.manager [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:33:42 compute-0 kernel: tap28b88230-ad (unregistering): left promiscuous mode
Sep 30 07:33:42 compute-0 NetworkManager[51813]: <info>  [1759217622.5235] device (tap28b88230-ad): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:42 compute-0 ovn_controller[91436]: 2025-09-30T07:33:42Z|00196|binding|INFO|Releasing lport 28b88230-ad9c-48dd-a487-28043768f2c6 from this chassis (sb_readonly=0)
Sep 30 07:33:42 compute-0 ovn_controller[91436]: 2025-09-30T07:33:42Z|00197|binding|INFO|Setting lport 28b88230-ad9c-48dd-a487-28043768f2c6 down in Southbound
Sep 30 07:33:42 compute-0 ovn_controller[91436]: 2025-09-30T07:33:42Z|00198|binding|INFO|Removing iface tap28b88230-ad ovn-installed in OVS
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:42.573 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:23:30:dc 10.100.0.10'], port_security=['fa:16:3e:23:30:dc 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'f680ae8a-3adb-4298-84c6-cae58224d553', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '16', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=28b88230-ad9c-48dd-a487-28043768f2c6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:33:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:42.573 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 28b88230-ad9c-48dd-a487-28043768f2c6 in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:33:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:42.577 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c99c822b-3191-49e5-b938-903e25b4a9bb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:33:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:42.578 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e9dc09-a066-4120-a884-c306a9830faf]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:42.578 100322 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb namespace which is not needed anymore
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.594 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:42 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000012.scope: Deactivated successfully.
Sep 30 07:33:42 compute-0 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d00000012.scope: Consumed 2.623s CPU time.
Sep 30 07:33:42 compute-0 systemd-machined[149233]: Machine qemu-15-instance-00000012 terminated.
Sep 30 07:33:42 compute-0 podman[220202]: 2025-09-30 07:33:42.636029347 +0000 UTC m=+0.091541508 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:42 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[219838]: [NOTICE]   (219842) : haproxy version is 3.0.5-8e879a5
Sep 30 07:33:42 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[219838]: [NOTICE]   (219842) : path to executable is /usr/sbin/haproxy
Sep 30 07:33:42 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[219838]: [WARNING]  (219842) : Exiting Master process...
Sep 30 07:33:42 compute-0 podman[220244]: 2025-09-30 07:33:42.752887182 +0000 UTC m=+0.049270360 container kill cafba4cf241096d989f2f80e0ad2b4f4549b49d8710d6344cfbc604109bf66f1 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0)
Sep 30 07:33:42 compute-0 systemd[1]: libpod-cafba4cf241096d989f2f80e0ad2b4f4549b49d8710d6344cfbc604109bf66f1.scope: Deactivated successfully.
Sep 30 07:33:42 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[219838]: [ALERT]    (219842) : Current worker (219844) exited with code 143 (Terminated)
Sep 30 07:33:42 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[219838]: [WARNING]  (219842) : All workers exited. Exiting... (0)
Sep 30 07:33:42 compute-0 conmon[219838]: conmon cafba4cf241096d989f2 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-cafba4cf241096d989f2f80e0ad2b4f4549b49d8710d6344cfbc604109bf66f1.scope/container/memory.events
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.785 2 INFO nova.virt.libvirt.driver [-] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Instance destroyed successfully.
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.785 2 DEBUG nova.objects.instance [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lazy-loading 'resources' on Instance uuid f680ae8a-3adb-4298-84c6-cae58224d553 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:33:42 compute-0 podman[220268]: 2025-09-30 07:33:42.806360692 +0000 UTC m=+0.029813270 container died cafba4cf241096d989f2f80e0ad2b4f4549b49d8710d6344cfbc604109bf66f1 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 07:33:42 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cafba4cf241096d989f2f80e0ad2b4f4549b49d8710d6344cfbc604109bf66f1-userdata-shm.mount: Deactivated successfully.
Sep 30 07:33:42 compute-0 systemd[1]: var-lib-containers-storage-overlay-c5b858e933b595d09be88bc21d91acdaac0facdc0d72ff7dc7b261bc58d2ca8e-merged.mount: Deactivated successfully.
Sep 30 07:33:42 compute-0 podman[220268]: 2025-09-30 07:33:42.862037265 +0000 UTC m=+0.085489763 container cleanup cafba4cf241096d989f2f80e0ad2b4f4549b49d8710d6344cfbc604109bf66f1 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 07:33:42 compute-0 systemd[1]: libpod-conmon-cafba4cf241096d989f2f80e0ad2b4f4549b49d8710d6344cfbc604109bf66f1.scope: Deactivated successfully.
Sep 30 07:33:42 compute-0 podman[220280]: 2025-09-30 07:33:42.884892414 +0000 UTC m=+0.088246593 container remove cafba4cf241096d989f2f80e0ad2b4f4549b49d8710d6344cfbc604109bf66f1 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.891 2 DEBUG nova.compute.manager [req-7f983f61-4f3a-419b-8d04-86f58a606b5a req-a7a0485f-09af-40b6-b9e2-7cabe44dab3e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Received event network-vif-unplugged-28b88230-ad9c-48dd-a487-28043768f2c6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.892 2 DEBUG oslo_concurrency.lockutils [req-7f983f61-4f3a-419b-8d04-86f58a606b5a req-a7a0485f-09af-40b6-b9e2-7cabe44dab3e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "f680ae8a-3adb-4298-84c6-cae58224d553-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.892 2 DEBUG oslo_concurrency.lockutils [req-7f983f61-4f3a-419b-8d04-86f58a606b5a req-a7a0485f-09af-40b6-b9e2-7cabe44dab3e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "f680ae8a-3adb-4298-84c6-cae58224d553-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.892 2 DEBUG oslo_concurrency.lockutils [req-7f983f61-4f3a-419b-8d04-86f58a606b5a req-a7a0485f-09af-40b6-b9e2-7cabe44dab3e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "f680ae8a-3adb-4298-84c6-cae58224d553-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.892 2 DEBUG nova.compute.manager [req-7f983f61-4f3a-419b-8d04-86f58a606b5a req-a7a0485f-09af-40b6-b9e2-7cabe44dab3e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] No waiting events found dispatching network-vif-unplugged-28b88230-ad9c-48dd-a487-28043768f2c6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.893 2 DEBUG nova.compute.manager [req-7f983f61-4f3a-419b-8d04-86f58a606b5a req-a7a0485f-09af-40b6-b9e2-7cabe44dab3e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Received event network-vif-unplugged-28b88230-ad9c-48dd-a487-28043768f2c6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:33:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:42.896 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f3f25586-50f9-4a54-adc1-1df7bb0ed826]: (4, ("Tue Sep 30 07:33:42 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb (cafba4cf241096d989f2f80e0ad2b4f4549b49d8710d6344cfbc604109bf66f1)\ncafba4cf241096d989f2f80e0ad2b4f4549b49d8710d6344cfbc604109bf66f1\nTue Sep 30 07:33:42 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb (cafba4cf241096d989f2f80e0ad2b4f4549b49d8710d6344cfbc604109bf66f1)\ncafba4cf241096d989f2f80e0ad2b4f4549b49d8710d6344cfbc604109bf66f1\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:42.897 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0298f164-7ace-471a-88e5-190129682c9f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:42.898 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:33:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:42.898 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[086cfb20-60e0-4dec-bc83-1d26f289568f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:42.899 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.901 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:42 compute-0 kernel: tapc99c822b-30: left promiscuous mode
Sep 30 07:33:42 compute-0 nova_compute[189265]: 2025-09-30 07:33:42.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:42.918 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[18eb2413-ab0b-4ede-ae90-8a8c165d60b8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:42.942 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[1b456427-6765-4994-93f7-9a3fbcb899d1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:42.944 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[6a369748-0591-4416-a5a1-9b0e6b7dcc00]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:42.961 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[70ff7676-f9f1-4bda-b31a-80e71358a0c3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 542257, 'reachable_time': 44941, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220305, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:42.963 100440 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 07:33:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:33:42.964 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[1a0be200-0c7a-4669-bd08-8101cb5d3ea4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:33:42 compute-0 systemd[1]: run-netns-ovnmeta\x2dc99c822b\x2d3191\x2d49e5\x2db938\x2d903e25b4a9bb.mount: Deactivated successfully.
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.295 2 DEBUG nova.virt.libvirt.vif [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T07:32:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1562644846',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1562644846',id=18,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:32:27Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-szxf7qui',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',clean_attempts='1',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:33:28Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=f680ae8a-3adb-4298-84c6-cae58224d553,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "28b88230-ad9c-48dd-a487-28043768f2c6", "address": "fa:16:3e:23:30:dc", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b88230-ad", "ovs_interfaceid": "28b88230-ad9c-48dd-a487-28043768f2c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.296 2 DEBUG nova.network.os_vif_util [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converting VIF {"id": "28b88230-ad9c-48dd-a487-28043768f2c6", "address": "fa:16:3e:23:30:dc", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap28b88230-ad", "ovs_interfaceid": "28b88230-ad9c-48dd-a487-28043768f2c6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.297 2 DEBUG nova.network.os_vif_util [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:23:30:dc,bridge_name='br-int',has_traffic_filtering=True,id=28b88230-ad9c-48dd-a487-28043768f2c6,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b88230-ad') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.298 2 DEBUG os_vif [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:30:dc,bridge_name='br-int',has_traffic_filtering=True,id=28b88230-ad9c-48dd-a487-28043768f2c6,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b88230-ad') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.300 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28b88230-ad, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.307 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=6c73f78f-f580-4986-b80a-6eabfe3a5b4a) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.313 2 INFO os_vif [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:23:30:dc,bridge_name='br-int',has_traffic_filtering=True,id=28b88230-ad9c-48dd-a487-28043768f2c6,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap28b88230-ad')
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.314 2 INFO nova.virt.libvirt.driver [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Deleting instance files /var/lib/nova/instances/f680ae8a-3adb-4298-84c6-cae58224d553_del
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.315 2 INFO nova.virt.libvirt.driver [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Deletion of /var/lib/nova/instances/f680ae8a-3adb-4298-84c6-cae58224d553_del complete
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.832 2 INFO nova.compute.manager [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Took 1.34 seconds to destroy the instance on the hypervisor.
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.832 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.833 2 DEBUG nova.compute.manager [-] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.834 2 DEBUG nova.network.neutron [-] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:33:43 compute-0 nova_compute[189265]: 2025-09-30 07:33:43.834 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:44 compute-0 nova_compute[189265]: 2025-09-30 07:33:44.295 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:33:44 compute-0 nova_compute[189265]: 2025-09-30 07:33:44.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:33:44 compute-0 nova_compute[189265]: 2025-09-30 07:33:44.949 2 DEBUG nova.compute.manager [req-9aff6627-2b6c-4abc-8a27-df5520441f44 req-c1798da6-7d71-4831-889a-ac056f19cdb2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Received event network-vif-unplugged-28b88230-ad9c-48dd-a487-28043768f2c6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:33:44 compute-0 nova_compute[189265]: 2025-09-30 07:33:44.950 2 DEBUG oslo_concurrency.lockutils [req-9aff6627-2b6c-4abc-8a27-df5520441f44 req-c1798da6-7d71-4831-889a-ac056f19cdb2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "f680ae8a-3adb-4298-84c6-cae58224d553-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:33:44 compute-0 nova_compute[189265]: 2025-09-30 07:33:44.950 2 DEBUG oslo_concurrency.lockutils [req-9aff6627-2b6c-4abc-8a27-df5520441f44 req-c1798da6-7d71-4831-889a-ac056f19cdb2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "f680ae8a-3adb-4298-84c6-cae58224d553-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:33:44 compute-0 nova_compute[189265]: 2025-09-30 07:33:44.951 2 DEBUG oslo_concurrency.lockutils [req-9aff6627-2b6c-4abc-8a27-df5520441f44 req-c1798da6-7d71-4831-889a-ac056f19cdb2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "f680ae8a-3adb-4298-84c6-cae58224d553-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:33:44 compute-0 nova_compute[189265]: 2025-09-30 07:33:44.951 2 DEBUG nova.compute.manager [req-9aff6627-2b6c-4abc-8a27-df5520441f44 req-c1798da6-7d71-4831-889a-ac056f19cdb2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] No waiting events found dispatching network-vif-unplugged-28b88230-ad9c-48dd-a487-28043768f2c6 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:33:44 compute-0 nova_compute[189265]: 2025-09-30 07:33:44.952 2 DEBUG nova.compute.manager [req-9aff6627-2b6c-4abc-8a27-df5520441f44 req-c1798da6-7d71-4831-889a-ac056f19cdb2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Received event network-vif-unplugged-28b88230-ad9c-48dd-a487-28043768f2c6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:33:44 compute-0 nova_compute[189265]: 2025-09-30 07:33:44.952 2 DEBUG nova.compute.manager [req-9aff6627-2b6c-4abc-8a27-df5520441f44 req-c1798da6-7d71-4831-889a-ac056f19cdb2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Received event network-vif-deleted-28b88230-ad9c-48dd-a487-28043768f2c6 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:33:44 compute-0 nova_compute[189265]: 2025-09-30 07:33:44.952 2 INFO nova.compute.manager [req-9aff6627-2b6c-4abc-8a27-df5520441f44 req-c1798da6-7d71-4831-889a-ac056f19cdb2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Neutron deleted interface 28b88230-ad9c-48dd-a487-28043768f2c6; detaching it from the instance and deleting it from the info cache
Sep 30 07:33:44 compute-0 nova_compute[189265]: 2025-09-30 07:33:44.953 2 DEBUG nova.network.neutron [req-9aff6627-2b6c-4abc-8a27-df5520441f44 req-c1798da6-7d71-4831-889a-ac056f19cdb2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:33:45 compute-0 nova_compute[189265]: 2025-09-30 07:33:45.096 2 DEBUG nova.network.neutron [-] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:33:45 compute-0 nova_compute[189265]: 2025-09-30 07:33:45.463 2 DEBUG nova.compute.manager [req-9aff6627-2b6c-4abc-8a27-df5520441f44 req-c1798da6-7d71-4831-889a-ac056f19cdb2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Detach interface failed, port_id=28b88230-ad9c-48dd-a487-28043768f2c6, reason: Instance f680ae8a-3adb-4298-84c6-cae58224d553 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 07:33:45 compute-0 nova_compute[189265]: 2025-09-30 07:33:45.603 2 INFO nova.compute.manager [-] [instance: f680ae8a-3adb-4298-84c6-cae58224d553] Took 1.77 seconds to deallocate network for instance.
Sep 30 07:33:45 compute-0 nova_compute[189265]: 2025-09-30 07:33:45.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:33:46 compute-0 nova_compute[189265]: 2025-09-30 07:33:46.126 2 DEBUG oslo_concurrency.lockutils [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:33:46 compute-0 nova_compute[189265]: 2025-09-30 07:33:46.127 2 DEBUG oslo_concurrency.lockutils [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:33:46 compute-0 nova_compute[189265]: 2025-09-30 07:33:46.133 2 DEBUG oslo_concurrency.lockutils [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:33:46 compute-0 nova_compute[189265]: 2025-09-30 07:33:46.165 2 INFO nova.scheduler.client.report [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Deleted allocations for instance f680ae8a-3adb-4298-84c6-cae58224d553
Sep 30 07:33:46 compute-0 podman[220306]: 2025-09-30 07:33:46.494751054 +0000 UTC m=+0.074838986 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, io.openshift.expose-services=, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Sep 30 07:33:46 compute-0 nova_compute[189265]: 2025-09-30 07:33:46.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:33:46 compute-0 nova_compute[189265]: 2025-09-30 07:33:46.787 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:33:47 compute-0 nova_compute[189265]: 2025-09-30 07:33:47.195 2 DEBUG oslo_concurrency.lockutils [None req-9b210b13-ad96-421c-aa3e-2aec378fddb9 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "f680ae8a-3adb-4298-84c6-cae58224d553" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.240s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:33:48 compute-0 nova_compute[189265]: 2025-09-30 07:33:48.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:48 compute-0 nova_compute[189265]: 2025-09-30 07:33:48.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:49 compute-0 nova_compute[189265]: 2025-09-30 07:33:49.667 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:33:50 compute-0 podman[220330]: 2025-09-30 07:33:50.484097203 +0000 UTC m=+0.065381014 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 07:33:50 compute-0 podman[220331]: 2025-09-30 07:33:50.484243227 +0000 UTC m=+0.062818440 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Sep 30 07:33:50 compute-0 podman[220332]: 2025-09-30 07:33:50.540298861 +0000 UTC m=+0.108462154 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2)
Sep 30 07:33:51 compute-0 nova_compute[189265]: 2025-09-30 07:33:51.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:33:53 compute-0 nova_compute[189265]: 2025-09-30 07:33:53.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:53 compute-0 nova_compute[189265]: 2025-09-30 07:33:53.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:54 compute-0 nova_compute[189265]: 2025-09-30 07:33:54.300 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:33:54 compute-0 nova_compute[189265]: 2025-09-30 07:33:54.301 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:33:54 compute-0 nova_compute[189265]: 2025-09-30 07:33:54.301 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:33:54 compute-0 nova_compute[189265]: 2025-09-30 07:33:54.839 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:33:54 compute-0 nova_compute[189265]: 2025-09-30 07:33:54.840 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:33:54 compute-0 nova_compute[189265]: 2025-09-30 07:33:54.840 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:33:54 compute-0 nova_compute[189265]: 2025-09-30 07:33:54.841 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:33:55 compute-0 nova_compute[189265]: 2025-09-30 07:33:55.025 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:33:55 compute-0 nova_compute[189265]: 2025-09-30 07:33:55.026 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:33:55 compute-0 nova_compute[189265]: 2025-09-30 07:33:55.064 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:33:55 compute-0 nova_compute[189265]: 2025-09-30 07:33:55.065 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5846MB free_disk=73.30380249023438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:33:55 compute-0 nova_compute[189265]: 2025-09-30 07:33:55.065 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:33:55 compute-0 nova_compute[189265]: 2025-09-30 07:33:55.065 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:33:56 compute-0 nova_compute[189265]: 2025-09-30 07:33:56.110 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:33:56 compute-0 nova_compute[189265]: 2025-09-30 07:33:56.111 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:33:55 up  1:31,  0 user,  load average: 0.21, 0.23, 0.28\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:33:56 compute-0 nova_compute[189265]: 2025-09-30 07:33:56.138 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:33:56 compute-0 nova_compute[189265]: 2025-09-30 07:33:56.646 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:33:57 compute-0 nova_compute[189265]: 2025-09-30 07:33:57.157 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:33:57 compute-0 nova_compute[189265]: 2025-09-30 07:33:57.158 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.092s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:33:58 compute-0 nova_compute[189265]: 2025-09-30 07:33:58.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:58 compute-0 nova_compute[189265]: 2025-09-30 07:33:58.844 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:33:59 compute-0 nova_compute[189265]: 2025-09-30 07:33:59.641 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:33:59 compute-0 podman[199733]: time="2025-09-30T07:33:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:33:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:33:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:33:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:33:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Sep 30 07:34:00 compute-0 nova_compute[189265]: 2025-09-30 07:34:00.152 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:34:01 compute-0 openstack_network_exporter[201859]: ERROR   07:34:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:34:01 compute-0 openstack_network_exporter[201859]: ERROR   07:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:34:01 compute-0 openstack_network_exporter[201859]: ERROR   07:34:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:34:01 compute-0 openstack_network_exporter[201859]: ERROR   07:34:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:34:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:34:01 compute-0 openstack_network_exporter[201859]: ERROR   07:34:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:34:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:34:03 compute-0 nova_compute[189265]: 2025-09-30 07:34:03.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:03 compute-0 nova_compute[189265]: 2025-09-30 07:34:03.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:04 compute-0 podman[220393]: 2025-09-30 07:34:04.507170891 +0000 UTC m=+0.086186403 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:34:04 compute-0 nova_compute[189265]: 2025-09-30 07:34:04.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:34:04 compute-0 nova_compute[189265]: 2025-09-30 07:34:04.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 07:34:05 compute-0 nova_compute[189265]: 2025-09-30 07:34:05.297 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 07:34:06 compute-0 unix_chkpwd[220420]: password check failed for user (root)
Sep 30 07:34:06 compute-0 sshd-session[220418]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 07:34:06 compute-0 nova_compute[189265]: 2025-09-30 07:34:06.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:34:06 compute-0 nova_compute[189265]: 2025-09-30 07:34:06.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 07:34:08 compute-0 nova_compute[189265]: 2025-09-30 07:34:08.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:08 compute-0 sshd-session[220418]: Failed password for root from 193.46.255.20 port 16846 ssh2
Sep 30 07:34:08 compute-0 nova_compute[189265]: 2025-09-30 07:34:08.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:10 compute-0 unix_chkpwd[220421]: password check failed for user (root)
Sep 30 07:34:13 compute-0 sshd-session[220418]: Failed password for root from 193.46.255.20 port 16846 ssh2
Sep 30 07:34:13 compute-0 nova_compute[189265]: 2025-09-30 07:34:13.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:13 compute-0 podman[220422]: 2025-09-30 07:34:13.506356648 +0000 UTC m=+0.086407310 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:34:13 compute-0 nova_compute[189265]: 2025-09-30 07:34:13.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:14 compute-0 unix_chkpwd[220443]: password check failed for user (root)
Sep 30 07:34:16 compute-0 sshd-session[220418]: Failed password for root from 193.46.255.20 port 16846 ssh2
Sep 30 07:34:17 compute-0 sshd-session[220418]: Received disconnect from 193.46.255.20 port 16846:11:  [preauth]
Sep 30 07:34:17 compute-0 sshd-session[220418]: Disconnected from authenticating user root 193.46.255.20 port 16846 [preauth]
Sep 30 07:34:17 compute-0 sshd-session[220418]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 07:34:17 compute-0 podman[220446]: 2025-09-30 07:34:17.506190339 +0000 UTC m=+0.085247876 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, container_name=openstack_network_exporter, name=ubi9-minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, release=1755695350)
Sep 30 07:34:17 compute-0 unix_chkpwd[220467]: password check failed for user (root)
Sep 30 07:34:17 compute-0 sshd-session[220444]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 07:34:18 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:18.043 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:34:18 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:18.043 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:34:18 compute-0 nova_compute[189265]: 2025-09-30 07:34:18.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:18 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:18.046 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:34:18 compute-0 nova_compute[189265]: 2025-09-30 07:34:18.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:19 compute-0 nova_compute[189265]: 2025-09-30 07:34:19.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:19 compute-0 sshd-session[220444]: Failed password for root from 193.46.255.20 port 36298 ssh2
Sep 30 07:34:20 compute-0 unix_chkpwd[220469]: password check failed for user (root)
Sep 30 07:34:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:20.567 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:34:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:20.568 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:34:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:20.568 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:34:21 compute-0 podman[220471]: 2025-09-30 07:34:21.514561135 +0000 UTC m=+0.099912828 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0)
Sep 30 07:34:21 compute-0 podman[220472]: 2025-09-30 07:34:21.517193881 +0000 UTC m=+0.088763678 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 07:34:21 compute-0 podman[220473]: 2025-09-30 07:34:21.554654909 +0000 UTC m=+0.124300750 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 07:34:21 compute-0 nova_compute[189265]: 2025-09-30 07:34:21.640 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:34:22 compute-0 sshd-session[220444]: Failed password for root from 193.46.255.20 port 36298 ssh2
Sep 30 07:34:23 compute-0 nova_compute[189265]: 2025-09-30 07:34:23.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:24 compute-0 nova_compute[189265]: 2025-09-30 07:34:24.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:24 compute-0 unix_chkpwd[220535]: password check failed for user (root)
Sep 30 07:34:26 compute-0 sshd-session[220444]: Failed password for root from 193.46.255.20 port 36298 ssh2
Sep 30 07:34:27 compute-0 nova_compute[189265]: 2025-09-30 07:34:27.693 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "396e11ed-839f-4c1e-be94-410ca9634b50" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:34:27 compute-0 nova_compute[189265]: 2025-09-30 07:34:27.693 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:34:28 compute-0 nova_compute[189265]: 2025-09-30 07:34:28.226 2 DEBUG nova.compute.manager [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 07:34:28 compute-0 nova_compute[189265]: 2025-09-30 07:34:28.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:28 compute-0 sshd-session[220444]: Received disconnect from 193.46.255.20 port 36298:11:  [preauth]
Sep 30 07:34:28 compute-0 sshd-session[220444]: Disconnected from authenticating user root 193.46.255.20 port 36298 [preauth]
Sep 30 07:34:28 compute-0 sshd-session[220444]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 07:34:28 compute-0 nova_compute[189265]: 2025-09-30 07:34:28.832 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:34:28 compute-0 nova_compute[189265]: 2025-09-30 07:34:28.833 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:34:28 compute-0 nova_compute[189265]: 2025-09-30 07:34:28.842 2 DEBUG nova.virt.hardware [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 07:34:28 compute-0 nova_compute[189265]: 2025-09-30 07:34:28.843 2 INFO nova.compute.claims [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Claim successful on node compute-0.ctlplane.example.com
Sep 30 07:34:29 compute-0 nova_compute[189265]: 2025-09-30 07:34:29.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:29 compute-0 unix_chkpwd[220538]: password check failed for user (root)
Sep 30 07:34:29 compute-0 sshd-session[220536]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 07:34:29 compute-0 podman[199733]: time="2025-09-30T07:34:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:34:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:34:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:34:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:34:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3007 "" "Go-http-client/1.1"
Sep 30 07:34:29 compute-0 nova_compute[189265]: 2025-09-30 07:34:29.917 2 DEBUG nova.compute.provider_tree [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:34:30 compute-0 nova_compute[189265]: 2025-09-30 07:34:30.437 2 DEBUG nova.scheduler.client.report [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:34:31 compute-0 nova_compute[189265]: 2025-09-30 07:34:31.056 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.223s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:34:31 compute-0 nova_compute[189265]: 2025-09-30 07:34:31.057 2 DEBUG nova.compute.manager [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 07:34:31 compute-0 sshd-session[220536]: Failed password for root from 193.46.255.20 port 30364 ssh2
Sep 30 07:34:31 compute-0 unix_chkpwd[220539]: password check failed for user (root)
Sep 30 07:34:31 compute-0 openstack_network_exporter[201859]: ERROR   07:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:34:31 compute-0 openstack_network_exporter[201859]: ERROR   07:34:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:34:31 compute-0 openstack_network_exporter[201859]: ERROR   07:34:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:34:31 compute-0 openstack_network_exporter[201859]: ERROR   07:34:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:34:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:34:31 compute-0 openstack_network_exporter[201859]: ERROR   07:34:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:34:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:34:31 compute-0 nova_compute[189265]: 2025-09-30 07:34:31.572 2 DEBUG nova.compute.manager [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 07:34:31 compute-0 nova_compute[189265]: 2025-09-30 07:34:31.573 2 DEBUG nova.network.neutron [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 07:34:31 compute-0 nova_compute[189265]: 2025-09-30 07:34:31.573 2 WARNING neutronclient.v2_0.client [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:34:31 compute-0 nova_compute[189265]: 2025-09-30 07:34:31.574 2 WARNING neutronclient.v2_0.client [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:34:32 compute-0 nova_compute[189265]: 2025-09-30 07:34:32.097 2 INFO nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 07:34:32 compute-0 nova_compute[189265]: 2025-09-30 07:34:32.613 2 DEBUG nova.compute.manager [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 07:34:32 compute-0 nova_compute[189265]: 2025-09-30 07:34:32.632 2 DEBUG nova.network.neutron [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Successfully created port: bf4216e4-2e68-4aff-8fec-7de17189c27c _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.633 2 DEBUG nova.network.neutron [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Successfully updated port: bf4216e4-2e68-4aff-8fec-7de17189c27c _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.657 2 DEBUG nova.compute.manager [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.659 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.660 2 INFO nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Creating image(s)
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.661 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.661 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.662 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.663 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.669 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.671 2 DEBUG oslo_concurrency.processutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.699 2 DEBUG nova.compute.manager [req-4f66fb10-2c18-4920-8562-f175cfdee994 req-ed51f7ab-a77c-4917-86da-9d3d1f09bc8a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received event network-changed-bf4216e4-2e68-4aff-8fec-7de17189c27c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.700 2 DEBUG nova.compute.manager [req-4f66fb10-2c18-4920-8562-f175cfdee994 req-ed51f7ab-a77c-4917-86da-9d3d1f09bc8a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Refreshing instance network info cache due to event network-changed-bf4216e4-2e68-4aff-8fec-7de17189c27c. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.700 2 DEBUG oslo_concurrency.lockutils [req-4f66fb10-2c18-4920-8562-f175cfdee994 req-ed51f7ab-a77c-4917-86da-9d3d1f09bc8a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-396e11ed-839f-4c1e-be94-410ca9634b50" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.701 2 DEBUG oslo_concurrency.lockutils [req-4f66fb10-2c18-4920-8562-f175cfdee994 req-ed51f7ab-a77c-4917-86da-9d3d1f09bc8a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-396e11ed-839f-4c1e-be94-410ca9634b50" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.701 2 DEBUG nova.network.neutron [req-4f66fb10-2c18-4920-8562-f175cfdee994 req-ed51f7ab-a77c-4917-86da-9d3d1f09bc8a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Refreshing network info cache for port bf4216e4-2e68-4aff-8fec-7de17189c27c _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.751 2 DEBUG oslo_concurrency.processutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.752 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.753 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.754 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.760 2 DEBUG oslo_utils.imageutils.format_inspector [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.761 2 DEBUG oslo_concurrency.processutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.828 2 DEBUG oslo_concurrency.processutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.829 2 DEBUG oslo_concurrency.processutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.868 2 DEBUG oslo_concurrency.processutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:34:33 compute-0 sshd-session[220536]: Failed password for root from 193.46.255.20 port 30364 ssh2
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.869 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.870 2 DEBUG oslo_concurrency.processutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.926 2 DEBUG oslo_concurrency.processutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.927 2 DEBUG nova.virt.disk.api [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Checking if we can resize image /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:34:33 compute-0 nova_compute[189265]: 2025-09-30 07:34:33.928 2 DEBUG oslo_concurrency.processutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:34:34 compute-0 nova_compute[189265]: 2025-09-30 07:34:34.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:34 compute-0 nova_compute[189265]: 2025-09-30 07:34:34.017 2 DEBUG oslo_concurrency.processutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:34:34 compute-0 nova_compute[189265]: 2025-09-30 07:34:34.018 2 DEBUG nova.virt.disk.api [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Cannot resize image /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:34:34 compute-0 nova_compute[189265]: 2025-09-30 07:34:34.019 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 07:34:34 compute-0 nova_compute[189265]: 2025-09-30 07:34:34.019 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Ensure instance console log exists: /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 07:34:34 compute-0 nova_compute[189265]: 2025-09-30 07:34:34.020 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:34:34 compute-0 nova_compute[189265]: 2025-09-30 07:34:34.020 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:34:34 compute-0 nova_compute[189265]: 2025-09-30 07:34:34.021 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:34:34 compute-0 nova_compute[189265]: 2025-09-30 07:34:34.143 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "refresh_cache-396e11ed-839f-4c1e-be94-410ca9634b50" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:34:34 compute-0 nova_compute[189265]: 2025-09-30 07:34:34.386 2 WARNING neutronclient.v2_0.client [req-4f66fb10-2c18-4920-8562-f175cfdee994 req-ed51f7ab-a77c-4917-86da-9d3d1f09bc8a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:34:34 compute-0 nova_compute[189265]: 2025-09-30 07:34:34.652 2 DEBUG nova.network.neutron [req-4f66fb10-2c18-4920-8562-f175cfdee994 req-ed51f7ab-a77c-4917-86da-9d3d1f09bc8a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:34:34 compute-0 nova_compute[189265]: 2025-09-30 07:34:34.862 2 DEBUG nova.network.neutron [req-4f66fb10-2c18-4920-8562-f175cfdee994 req-ed51f7ab-a77c-4917-86da-9d3d1f09bc8a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:34:35 compute-0 nova_compute[189265]: 2025-09-30 07:34:35.376 2 DEBUG oslo_concurrency.lockutils [req-4f66fb10-2c18-4920-8562-f175cfdee994 req-ed51f7ab-a77c-4917-86da-9d3d1f09bc8a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-396e11ed-839f-4c1e-be94-410ca9634b50" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:34:35 compute-0 nova_compute[189265]: 2025-09-30 07:34:35.377 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquired lock "refresh_cache-396e11ed-839f-4c1e-be94-410ca9634b50" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:34:35 compute-0 nova_compute[189265]: 2025-09-30 07:34:35.378 2 DEBUG nova.network.neutron [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:34:35 compute-0 podman[220555]: 2025-09-30 07:34:35.465154917 +0000 UTC m=+0.055555291 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:34:35 compute-0 unix_chkpwd[220579]: password check failed for user (root)
Sep 30 07:34:36 compute-0 nova_compute[189265]: 2025-09-30 07:34:36.818 2 DEBUG nova.network.neutron [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:34:37 compute-0 sshd-session[220536]: Failed password for root from 193.46.255.20 port 30364 ssh2
Sep 30 07:34:37 compute-0 nova_compute[189265]: 2025-09-30 07:34:37.604 2 WARNING neutronclient.v2_0.client [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:34:37 compute-0 sshd-session[220536]: Received disconnect from 193.46.255.20 port 30364:11:  [preauth]
Sep 30 07:34:37 compute-0 sshd-session[220536]: Disconnected from authenticating user root 193.46.255.20 port 30364 [preauth]
Sep 30 07:34:37 compute-0 sshd-session[220536]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 07:34:37 compute-0 nova_compute[189265]: 2025-09-30 07:34:37.804 2 DEBUG nova.network.neutron [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Updating instance_info_cache with network_info: [{"id": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "address": "fa:16:3e:d6:c0:ac", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf4216e4-2e", "ovs_interfaceid": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.312 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Releasing lock "refresh_cache-396e11ed-839f-4c1e-be94-410ca9634b50" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.313 2 DEBUG nova.compute.manager [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Instance network_info: |[{"id": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "address": "fa:16:3e:d6:c0:ac", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf4216e4-2e", "ovs_interfaceid": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.317 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Start _get_guest_xml network_info=[{"id": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "address": "fa:16:3e:d6:c0:ac", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf4216e4-2e", "ovs_interfaceid": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '0c6b92f5-9861-49e4-862d-3ffd84520dfa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.323 2 WARNING nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.325 2 DEBUG nova.virt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteStrategies-server-314749624', uuid='396e11ed-839f-4c1e-be94-410ca9634b50'), owner=OwnerMeta(userid='89ba5d19014145188ad2a3c812acdc88', username='tempest-TestExecuteStrategies-1096120513-project-admin', projectid='6431607f3dce4c88bbf6d17ee6cd45b2', projectname='tempest-TestExecuteStrategies-1096120513'), image=ImageMeta(id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "address": "fa:16:3e:d6:c0:ac", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf4216e4-2e", "ovs_interfaceid": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "qbh_params": null, 
"qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759217678.325282) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.334 2 DEBUG nova.virt.libvirt.host [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.335 2 DEBUG nova.virt.libvirt.host [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.339 2 DEBUG nova.virt.libvirt.host [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.340 2 DEBUG nova.virt.libvirt.host [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.341 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.341 2 DEBUG nova.virt.hardware [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T07:07:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.342 2 DEBUG nova.virt.hardware [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.342 2 DEBUG nova.virt.hardware [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.343 2 DEBUG nova.virt.hardware [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.343 2 DEBUG nova.virt.hardware [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.344 2 DEBUG nova.virt.hardware [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.344 2 DEBUG nova.virt.hardware [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.345 2 DEBUG nova.virt.hardware [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.345 2 DEBUG nova.virt.hardware [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.346 2 DEBUG nova.virt.hardware [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.346 2 DEBUG nova.virt.hardware [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.353 2 DEBUG nova.virt.libvirt.vif [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:34:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-314749624',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-314749624',id=21,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-qh7t6g2f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'
},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:34:32Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=396e11ed-839f-4c1e-be94-410ca9634b50,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "address": "fa:16:3e:d6:c0:ac", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf4216e4-2e", "ovs_interfaceid": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.354 2 DEBUG nova.network.os_vif_util [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converting VIF {"id": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "address": "fa:16:3e:d6:c0:ac", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf4216e4-2e", "ovs_interfaceid": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.355 2 DEBUG nova.network.os_vif_util [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c0:ac,bridge_name='br-int',has_traffic_filtering=True,id=bf4216e4-2e68-4aff-8fec-7de17189c27c,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf4216e4-2e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.357 2 DEBUG nova.objects.instance [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lazy-loading 'pci_devices' on Instance uuid 396e11ed-839f-4c1e-be94-410ca9634b50 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.890 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] End _get_guest_xml xml=<domain type="kvm">
Sep 30 07:34:38 compute-0 nova_compute[189265]:   <uuid>396e11ed-839f-4c1e-be94-410ca9634b50</uuid>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   <name>instance-00000015</name>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   <memory>131072</memory>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   <vcpu>1</vcpu>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteStrategies-server-314749624</nova:name>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:34:38</nova:creationTime>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:34:38 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:34:38 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:34:38 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:34:38 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:34:38 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:34:38 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:34:38 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:34:38 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:34:38 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:34:38 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:34:38 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:34:38 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:34:38 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:34:38 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:34:38 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:34:38 compute-0 nova_compute[189265]:         <nova:user uuid="89ba5d19014145188ad2a3c812acdc88">tempest-TestExecuteStrategies-1096120513-project-admin</nova:user>
Sep 30 07:34:38 compute-0 nova_compute[189265]:         <nova:project uuid="6431607f3dce4c88bbf6d17ee6cd45b2">tempest-TestExecuteStrategies-1096120513</nova:project>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:34:38 compute-0 nova_compute[189265]:         <nova:port uuid="bf4216e4-2e68-4aff-8fec-7de17189c27c">
Sep 30 07:34:38 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <system>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <entry name="serial">396e11ed-839f-4c1e-be94-410ca9634b50</entry>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <entry name="uuid">396e11ed-839f-4c1e-be94-410ca9634b50</entry>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     </system>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   <os>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   </os>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   <features>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <vmcoreinfo/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   </features>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   <cpu mode="host-model" match="exact">
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk.config"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:d6:c0:ac"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <target dev="tapbf4216e4-2e"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/console.log" append="off"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <video>
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     </video>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <controller type="usb" index="0"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:34:38 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:34:38 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:34:38 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:34:38 compute-0 nova_compute[189265]: </domain>
Sep 30 07:34:38 compute-0 nova_compute[189265]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.893 2 DEBUG nova.compute.manager [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Preparing to wait for external event network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.893 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.894 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.894 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.895 2 DEBUG nova.virt.libvirt.vif [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:34:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-314749624',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-314749624',id=21,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-qh7t6g2f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:34:32Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=396e11ed-839f-4c1e-be94-410ca9634b50,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "address": "fa:16:3e:d6:c0:ac", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf4216e4-2e", "ovs_interfaceid": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.896 2 DEBUG nova.network.os_vif_util [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converting VIF {"id": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "address": "fa:16:3e:d6:c0:ac", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf4216e4-2e", "ovs_interfaceid": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.897 2 DEBUG nova.network.os_vif_util [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c0:ac,bridge_name='br-int',has_traffic_filtering=True,id=bf4216e4-2e68-4aff-8fec-7de17189c27c,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf4216e4-2e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.897 2 DEBUG os_vif [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c0:ac,bridge_name='br-int',has_traffic_filtering=True,id=bf4216e4-2e68-4aff-8fec-7de17189c27c,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf4216e4-2e') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.898 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.899 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.899 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.900 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.901 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '6cb336d6-36ff-5ca9-95ee-67da0bf536db', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.909 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbf4216e4-2e, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.910 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapbf4216e4-2e, col_values=(('qos', UUID('0c9ede59-cb28-4b47-bca5-09ec14d5dbfc')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.910 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapbf4216e4-2e, col_values=(('external_ids', {'iface-id': 'bf4216e4-2e68-4aff-8fec-7de17189c27c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d6:c0:ac', 'vm-uuid': '396e11ed-839f-4c1e-be94-410ca9634b50'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:34:38 compute-0 NetworkManager[51813]: <info>  [1759217678.9130] manager: (tapbf4216e4-2e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:38 compute-0 nova_compute[189265]: 2025-09-30 07:34:38.923 2 INFO os_vif [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c0:ac,bridge_name='br-int',has_traffic_filtering=True,id=bf4216e4-2e68-4aff-8fec-7de17189c27c,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf4216e4-2e')
Sep 30 07:34:39 compute-0 nova_compute[189265]: 2025-09-30 07:34:39.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:40 compute-0 nova_compute[189265]: 2025-09-30 07:34:40.507 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:34:40 compute-0 nova_compute[189265]: 2025-09-30 07:34:40.507 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:34:40 compute-0 nova_compute[189265]: 2025-09-30 07:34:40.508 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] No VIF found with MAC fa:16:3e:d6:c0:ac, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 07:34:40 compute-0 nova_compute[189265]: 2025-09-30 07:34:40.508 2 INFO nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Using config drive
Sep 30 07:34:41 compute-0 nova_compute[189265]: 2025-09-30 07:34:41.032 2 WARNING neutronclient.v2_0.client [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:34:41 compute-0 nova_compute[189265]: 2025-09-30 07:34:41.934 2 INFO nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Creating config drive at /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk.config
Sep 30 07:34:41 compute-0 nova_compute[189265]: 2025-09-30 07:34:41.945 2 DEBUG oslo_concurrency.processutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpen_ypbiq execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:34:42 compute-0 nova_compute[189265]: 2025-09-30 07:34:42.088 2 DEBUG oslo_concurrency.processutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpen_ypbiq" returned: 0 in 0.143s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:34:42 compute-0 kernel: tapbf4216e4-2e: entered promiscuous mode
Sep 30 07:34:42 compute-0 NetworkManager[51813]: <info>  [1759217682.1761] manager: (tapbf4216e4-2e): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Sep 30 07:34:42 compute-0 nova_compute[189265]: 2025-09-30 07:34:42.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:42 compute-0 ovn_controller[91436]: 2025-09-30T07:34:42Z|00199|binding|INFO|Claiming lport bf4216e4-2e68-4aff-8fec-7de17189c27c for this chassis.
Sep 30 07:34:42 compute-0 ovn_controller[91436]: 2025-09-30T07:34:42Z|00200|binding|INFO|bf4216e4-2e68-4aff-8fec-7de17189c27c: Claiming fa:16:3e:d6:c0:ac 10.100.0.5
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.190 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:c0:ac 10.100.0.5'], port_security=['fa:16:3e:d6:c0:ac 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '396e11ed-839f-4c1e-be94-410ca9634b50', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=bf4216e4-2e68-4aff-8fec-7de17189c27c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.192 100322 INFO neutron.agent.ovn.metadata.agent [-] Port bf4216e4-2e68-4aff-8fec-7de17189c27c in datapath c99c822b-3191-49e5-b938-903e25b4a9bb bound to our chassis
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.194 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:34:42 compute-0 ovn_controller[91436]: 2025-09-30T07:34:42Z|00201|binding|INFO|Setting lport bf4216e4-2e68-4aff-8fec-7de17189c27c ovn-installed in OVS
Sep 30 07:34:42 compute-0 ovn_controller[91436]: 2025-09-30T07:34:42Z|00202|binding|INFO|Setting lport bf4216e4-2e68-4aff-8fec-7de17189c27c up in Southbound
Sep 30 07:34:42 compute-0 systemd-udevd[220597]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:34:42 compute-0 nova_compute[189265]: 2025-09-30 07:34:42.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.214 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[6201725b-604f-4f27-a448-618f579cf192]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.216 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc99c822b-31 in ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.218 210650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc99c822b-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.218 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c84e4baa-5938-44b1-8005-8f6e7c90c50a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.219 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d67668-edce-47c6-a9d6-9450c751e0dc]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 NetworkManager[51813]: <info>  [1759217682.2281] device (tapbf4216e4-2e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:34:42 compute-0 NetworkManager[51813]: <info>  [1759217682.2292] device (tapbf4216e4-2e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.233 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[84eeced9-0be1-4027-864d-09bd172b6aa4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 systemd-machined[149233]: New machine qemu-16-instance-00000015.
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.250 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c55a8ef2-00d2-4069-bdf8-65e848487ec7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 systemd[1]: Started Virtual Machine qemu-16-instance-00000015.
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.289 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[b8e2841d-b650-44ad-9bc4-bb4ce4891823]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.294 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[6a8861bc-03a1-48f9-9736-58d2b877bafb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 systemd-udevd[220603]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:34:42 compute-0 NetworkManager[51813]: <info>  [1759217682.2971] manager: (tapc99c822b-30): new Veth device (/org/freedesktop/NetworkManager/Devices/72)
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.339 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[44073eb4-c71e-4346-bf94-dad1516a12f0]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.343 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[dced5e99-c5a1-4e65-a3cc-7848223f44bd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 NetworkManager[51813]: <info>  [1759217682.3761] device (tapc99c822b-30): carrier: link connected
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.387 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[322b81a5-8215-4ee2-8a80-702f9af5c3e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.411 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[4f7111a2-a904-4ef6-a9dc-79c503dea342]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554009, 'reachable_time': 30841, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220633, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.433 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c4eae50e-0b3c-42e4-a9ba-32c0478b03f1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:678c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 554009, 'tstamp': 554009}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220634, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.458 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[09ebff1a-0a2a-4a12-9ba6-f4b8ca349314]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 51], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554009, 'reachable_time': 30841, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220635, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.501 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[e3b6946d-7d30-4a33-b9c9-9e51d7493b54]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.597 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[463b9ab8-b990-4995-aaba-e7f599f762a4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.600 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.601 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.602 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc99c822b-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:34:42 compute-0 kernel: tapc99c822b-30: entered promiscuous mode
Sep 30 07:34:42 compute-0 NetworkManager[51813]: <info>  [1759217682.6055] manager: (tapc99c822b-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/73)
Sep 30 07:34:42 compute-0 nova_compute[189265]: 2025-09-30 07:34:42.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:42 compute-0 nova_compute[189265]: 2025-09-30 07:34:42.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.609 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc99c822b-30, col_values=(('external_ids', {'iface-id': '67b7df48-3f38-444a-8506-1c0ec5bd1d15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:34:42 compute-0 nova_compute[189265]: 2025-09-30 07:34:42.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:42 compute-0 ovn_controller[91436]: 2025-09-30T07:34:42Z|00203|binding|INFO|Releasing lport 67b7df48-3f38-444a-8506-1c0ec5bd1d15 from this chassis (sb_readonly=0)
Sep 30 07:34:42 compute-0 nova_compute[189265]: 2025-09-30 07:34:42.668 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.670 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[fdf19f36-23a2-4713-8e16-6e5be3000bb4]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.672 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.672 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.672 100322 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for c99c822b-3191-49e5-b938-903e25b4a9bb disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.672 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.673 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[ecf4f7a0-4b65-4839-a4ae-3e5e063f371f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.674 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.675 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[ffcb7ada-ab7b-4c38-ba4c-5970eba514e6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.676 100322 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: global
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     log         /dev/log local0 debug
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     log-tag     haproxy-metadata-proxy-c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     user        root
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     group       root
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     maxconn     1024
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     pidfile     /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     daemon
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: defaults
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     log global
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     mode http
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     option httplog
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     option dontlognull
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     option http-server-close
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     option forwardfor
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     retries                 3
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     timeout http-request    30s
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     timeout connect         30s
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     timeout client          32s
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     timeout server          32s
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     timeout http-keep-alive 30s
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: listen listener
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     bind 169.254.169.254:80
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:     http-request add-header X-OVN-Network-ID c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 07:34:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:34:42.677 100322 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'env', 'PROCESS_TAG=haproxy-c99c822b-3191-49e5-b938-903e25b4a9bb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c99c822b-3191-49e5-b938-903e25b4a9bb.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 07:34:42 compute-0 nova_compute[189265]: 2025-09-30 07:34:42.940 2 DEBUG nova.compute.manager [req-de575105-3a50-4026-ba87-8df0723d5fbc req-bdd8fec0-e918-4d72-8570-0738d49e5d7c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received event network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:34:42 compute-0 nova_compute[189265]: 2025-09-30 07:34:42.941 2 DEBUG oslo_concurrency.lockutils [req-de575105-3a50-4026-ba87-8df0723d5fbc req-bdd8fec0-e918-4d72-8570-0738d49e5d7c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:34:42 compute-0 nova_compute[189265]: 2025-09-30 07:34:42.941 2 DEBUG oslo_concurrency.lockutils [req-de575105-3a50-4026-ba87-8df0723d5fbc req-bdd8fec0-e918-4d72-8570-0738d49e5d7c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:34:42 compute-0 nova_compute[189265]: 2025-09-30 07:34:42.941 2 DEBUG oslo_concurrency.lockutils [req-de575105-3a50-4026-ba87-8df0723d5fbc req-bdd8fec0-e918-4d72-8570-0738d49e5d7c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:34:42 compute-0 nova_compute[189265]: 2025-09-30 07:34:42.941 2 DEBUG nova.compute.manager [req-de575105-3a50-4026-ba87-8df0723d5fbc req-bdd8fec0-e918-4d72-8570-0738d49e5d7c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Processing event network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 07:34:42 compute-0 nova_compute[189265]: 2025-09-30 07:34:42.960 2 DEBUG nova.compute.manager [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 07:34:42 compute-0 nova_compute[189265]: 2025-09-30 07:34:42.966 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 07:34:42 compute-0 nova_compute[189265]: 2025-09-30 07:34:42.969 2 INFO nova.virt.libvirt.driver [-] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Instance spawned successfully.
Sep 30 07:34:42 compute-0 nova_compute[189265]: 2025-09-30 07:34:42.969 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 07:34:43 compute-0 podman[220674]: 2025-09-30 07:34:43.105469927 +0000 UTC m=+0.054766139 container create eb0898251616c836e52e1cfab38621e33c8a5da14729f28618888012a196bf70 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, tcib_managed=true, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Sep 30 07:34:43 compute-0 systemd[1]: Started libpod-conmon-eb0898251616c836e52e1cfab38621e33c8a5da14729f28618888012a196bf70.scope.
Sep 30 07:34:43 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:34:43 compute-0 podman[220674]: 2025-09-30 07:34:43.075532334 +0000 UTC m=+0.024828586 image pull eeebcc09bc72f81ab45f5ab87eb8f6a7b554b949227aeec082bdb0732754ddc8 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 07:34:43 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ac26c584db9f778b3c1f91dd38a68566debe66fd6bf6559b736cdd6f3439f20/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 07:34:43 compute-0 podman[220674]: 2025-09-30 07:34:43.206132796 +0000 UTC m=+0.155429038 container init eb0898251616c836e52e1cfab38621e33c8a5da14729f28618888012a196bf70 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:34:43 compute-0 podman[220674]: 2025-09-30 07:34:43.214490766 +0000 UTC m=+0.163786978 container start eb0898251616c836e52e1cfab38621e33c8a5da14729f28618888012a196bf70 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Sep 30 07:34:43 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[220689]: [NOTICE]   (220693) : New worker (220695) forked
Sep 30 07:34:43 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[220689]: [NOTICE]   (220693) : Loading success.
Sep 30 07:34:43 compute-0 nova_compute[189265]: 2025-09-30 07:34:43.293 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:34:43 compute-0 nova_compute[189265]: 2025-09-30 07:34:43.480 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:34:43 compute-0 nova_compute[189265]: 2025-09-30 07:34:43.481 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:34:43 compute-0 nova_compute[189265]: 2025-09-30 07:34:43.482 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:34:43 compute-0 nova_compute[189265]: 2025-09-30 07:34:43.482 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:34:43 compute-0 nova_compute[189265]: 2025-09-30 07:34:43.482 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:34:43 compute-0 nova_compute[189265]: 2025-09-30 07:34:43.483 2 DEBUG nova.virt.libvirt.driver [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:34:43 compute-0 nova_compute[189265]: 2025-09-30 07:34:43.914 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:43 compute-0 nova_compute[189265]: 2025-09-30 07:34:43.991 2 INFO nova.compute.manager [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Took 10.33 seconds to spawn the instance on the hypervisor.
Sep 30 07:34:43 compute-0 nova_compute[189265]: 2025-09-30 07:34:43.993 2 DEBUG nova.compute.manager [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:34:44 compute-0 nova_compute[189265]: 2025-09-30 07:34:44.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:44 compute-0 podman[220704]: 2025-09-30 07:34:44.506213607 +0000 UTC m=+0.082606770 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:34:44 compute-0 nova_compute[189265]: 2025-09-30 07:34:44.530 2 INFO nova.compute.manager [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Took 15.78 seconds to build instance.
Sep 30 07:34:45 compute-0 nova_compute[189265]: 2025-09-30 07:34:45.024 2 DEBUG nova.compute.manager [req-b9ea2853-c8ae-4561-9d8b-b9e0c996a0d7 req-0a8c05a3-e38d-4855-b786-1731c78e9149 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received event network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:34:45 compute-0 nova_compute[189265]: 2025-09-30 07:34:45.025 2 DEBUG oslo_concurrency.lockutils [req-b9ea2853-c8ae-4561-9d8b-b9e0c996a0d7 req-0a8c05a3-e38d-4855-b786-1731c78e9149 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:34:45 compute-0 nova_compute[189265]: 2025-09-30 07:34:45.026 2 DEBUG oslo_concurrency.lockutils [req-b9ea2853-c8ae-4561-9d8b-b9e0c996a0d7 req-0a8c05a3-e38d-4855-b786-1731c78e9149 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:34:45 compute-0 nova_compute[189265]: 2025-09-30 07:34:45.026 2 DEBUG oslo_concurrency.lockutils [req-b9ea2853-c8ae-4561-9d8b-b9e0c996a0d7 req-0a8c05a3-e38d-4855-b786-1731c78e9149 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:34:45 compute-0 nova_compute[189265]: 2025-09-30 07:34:45.027 2 DEBUG nova.compute.manager [req-b9ea2853-c8ae-4561-9d8b-b9e0c996a0d7 req-0a8c05a3-e38d-4855-b786-1731c78e9149 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] No waiting events found dispatching network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:34:45 compute-0 nova_compute[189265]: 2025-09-30 07:34:45.027 2 WARNING nova.compute.manager [req-b9ea2853-c8ae-4561-9d8b-b9e0c996a0d7 req-0a8c05a3-e38d-4855-b786-1731c78e9149 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received unexpected event network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c for instance with vm_state active and task_state None.
Sep 30 07:34:45 compute-0 nova_compute[189265]: 2025-09-30 07:34:45.035 2 DEBUG oslo_concurrency.lockutils [None req-0aafe029-7857-499c-9ba9-31ff960618f7 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.342s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:34:45 compute-0 nova_compute[189265]: 2025-09-30 07:34:45.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:34:45 compute-0 nova_compute[189265]: 2025-09-30 07:34:45.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:34:47 compute-0 nova_compute[189265]: 2025-09-30 07:34:47.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:34:47 compute-0 nova_compute[189265]: 2025-09-30 07:34:47.789 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:34:48 compute-0 podman[220724]: 2025-09-30 07:34:48.499050376 +0000 UTC m=+0.073215340 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 07:34:48 compute-0 nova_compute[189265]: 2025-09-30 07:34:48.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:49 compute-0 nova_compute[189265]: 2025-09-30 07:34:49.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:52 compute-0 podman[220745]: 2025-09-30 07:34:52.487674134 +0000 UTC m=+0.066357672 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=watcher_latest, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:34:52 compute-0 podman[220746]: 2025-09-30 07:34:52.511270473 +0000 UTC m=+0.088638913 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 07:34:52 compute-0 podman[220747]: 2025-09-30 07:34:52.528645624 +0000 UTC m=+0.096354726 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, managed_by=edpm_ansible)
Sep 30 07:34:53 compute-0 nova_compute[189265]: 2025-09-30 07:34:53.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:54 compute-0 nova_compute[189265]: 2025-09-30 07:34:54.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:54 compute-0 nova_compute[189265]: 2025-09-30 07:34:54.790 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:34:55 compute-0 ovn_controller[91436]: 2025-09-30T07:34:55Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d6:c0:ac 10.100.0.5
Sep 30 07:34:55 compute-0 ovn_controller[91436]: 2025-09-30T07:34:55Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d6:c0:ac 10.100.0.5
Sep 30 07:34:55 compute-0 nova_compute[189265]: 2025-09-30 07:34:55.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:34:55 compute-0 nova_compute[189265]: 2025-09-30 07:34:55.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:34:56 compute-0 nova_compute[189265]: 2025-09-30 07:34:56.314 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:34:56 compute-0 nova_compute[189265]: 2025-09-30 07:34:56.315 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:34:56 compute-0 nova_compute[189265]: 2025-09-30 07:34:56.315 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:34:56 compute-0 nova_compute[189265]: 2025-09-30 07:34:56.315 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:34:57 compute-0 nova_compute[189265]: 2025-09-30 07:34:57.372 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:34:57 compute-0 nova_compute[189265]: 2025-09-30 07:34:57.454 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:34:57 compute-0 nova_compute[189265]: 2025-09-30 07:34:57.457 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:34:57 compute-0 nova_compute[189265]: 2025-09-30 07:34:57.508 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:34:57 compute-0 nova_compute[189265]: 2025-09-30 07:34:57.748 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:34:57 compute-0 nova_compute[189265]: 2025-09-30 07:34:57.750 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:34:57 compute-0 nova_compute[189265]: 2025-09-30 07:34:57.781 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.032s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:34:57 compute-0 nova_compute[189265]: 2025-09-30 07:34:57.783 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5674MB free_disk=73.27515029907227GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:34:57 compute-0 nova_compute[189265]: 2025-09-30 07:34:57.783 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:34:57 compute-0 nova_compute[189265]: 2025-09-30 07:34:57.784 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:34:58 compute-0 nova_compute[189265]: 2025-09-30 07:34:58.903 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance 396e11ed-839f-4c1e-be94-410ca9634b50 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:34:58 compute-0 nova_compute[189265]: 2025-09-30 07:34:58.903 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:34:58 compute-0 nova_compute[189265]: 2025-09-30 07:34:58.903 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:34:57 up  1:32,  0 user,  load average: 0.60, 0.30, 0.30\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_6431607f3dce4c88bbf6d17ee6cd45b2': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:34:58 compute-0 nova_compute[189265]: 2025-09-30 07:34:58.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:58 compute-0 nova_compute[189265]: 2025-09-30 07:34:58.996 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing inventories for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 07:34:59 compute-0 nova_compute[189265]: 2025-09-30 07:34:59.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:34:59 compute-0 nova_compute[189265]: 2025-09-30 07:34:59.064 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating ProviderTree inventory for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 07:34:59 compute-0 nova_compute[189265]: 2025-09-30 07:34:59.064 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating inventory in ProviderTree for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 07:34:59 compute-0 nova_compute[189265]: 2025-09-30 07:34:59.076 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing aggregate associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 07:34:59 compute-0 nova_compute[189265]: 2025-09-30 07:34:59.112 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing trait associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, traits: COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_FMA3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_AC97,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_ABM,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,HW_CPU_X86_CLMUL _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 07:34:59 compute-0 nova_compute[189265]: 2025-09-30 07:34:59.166 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:34:59 compute-0 nova_compute[189265]: 2025-09-30 07:34:59.675 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:34:59 compute-0 podman[199733]: time="2025-09-30T07:34:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:34:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:34:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:34:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:34:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3474 "" "Go-http-client/1.1"
Sep 30 07:35:00 compute-0 nova_compute[189265]: 2025-09-30 07:35:00.185 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:35:00 compute-0 nova_compute[189265]: 2025-09-30 07:35:00.186 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.402s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:35:01 compute-0 openstack_network_exporter[201859]: ERROR   07:35:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:35:01 compute-0 openstack_network_exporter[201859]: ERROR   07:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:35:01 compute-0 openstack_network_exporter[201859]: ERROR   07:35:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:35:01 compute-0 openstack_network_exporter[201859]: ERROR   07:35:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:35:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:35:01 compute-0 openstack_network_exporter[201859]: ERROR   07:35:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:35:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:35:03 compute-0 nova_compute[189265]: 2025-09-30 07:35:03.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:04 compute-0 nova_compute[189265]: 2025-09-30 07:35:04.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:05 compute-0 nova_compute[189265]: 2025-09-30 07:35:05.186 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:35:06 compute-0 podman[220827]: 2025-09-30 07:35:06.486717389 +0000 UTC m=+0.070733817 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:35:09 compute-0 nova_compute[189265]: 2025-09-30 07:35:09.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:09 compute-0 nova_compute[189265]: 2025-09-30 07:35:09.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:09 compute-0 nova_compute[189265]: 2025-09-30 07:35:09.361 2 DEBUG nova.compute.manager [None req-e608b6ca-a4fc-422e-8a57-af4308d55634 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Adding trait COMPUTE_STATUS_DISABLED to compute node resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:635
Sep 30 07:35:09 compute-0 nova_compute[189265]: 2025-09-30 07:35:09.443 2 DEBUG nova.compute.provider_tree [None req-e608b6ca-a4fc-422e-8a57-af4308d55634 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Updating resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc generation from 27 to 28 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 07:35:12 compute-0 ovn_controller[91436]: 2025-09-30T07:35:12Z|00204|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Sep 30 07:35:14 compute-0 nova_compute[189265]: 2025-09-30 07:35:14.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:14 compute-0 nova_compute[189265]: 2025-09-30 07:35:14.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:15 compute-0 podman[220851]: 2025-09-30 07:35:15.514237913 +0000 UTC m=+0.080224621 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20250930)
Sep 30 07:35:16 compute-0 nova_compute[189265]: 2025-09-30 07:35:16.693 2 DEBUG nova.virt.libvirt.driver [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Check if temp file /var/lib/nova/instances/tmp4aa4rtau exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10968
Sep 30 07:35:16 compute-0 nova_compute[189265]: 2025-09-30 07:35:16.699 2 DEBUG nova.compute.manager [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4aa4rtau',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='396e11ed-839f-4c1e-be94-410ca9634b50',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.12/site-packages/nova/compute/manager.py:9294
Sep 30 07:35:18 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 07:35:19 compute-0 nova_compute[189265]: 2025-09-30 07:35:19.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:19 compute-0 nova_compute[189265]: 2025-09-30 07:35:19.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:19 compute-0 podman[220872]: 2025-09-30 07:35:19.559332438 +0000 UTC m=+0.130294833 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Sep 30 07:35:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:20.569 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:35:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:20.570 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:35:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:20.570 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:35:21 compute-0 nova_compute[189265]: 2025-09-30 07:35:21.221 2 DEBUG oslo_concurrency.processutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:35:21 compute-0 nova_compute[189265]: 2025-09-30 07:35:21.303 2 DEBUG oslo_concurrency.processutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:35:21 compute-0 nova_compute[189265]: 2025-09-30 07:35:21.305 2 DEBUG oslo_concurrency.processutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:35:21 compute-0 nova_compute[189265]: 2025-09-30 07:35:21.401 2 DEBUG oslo_concurrency.processutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:35:21 compute-0 nova_compute[189265]: 2025-09-30 07:35:21.403 2 DEBUG nova.compute.manager [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Preparing to wait for external event network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 07:35:21 compute-0 nova_compute[189265]: 2025-09-30 07:35:21.404 2 DEBUG oslo_concurrency.lockutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:35:21 compute-0 nova_compute[189265]: 2025-09-30 07:35:21.404 2 DEBUG oslo_concurrency.lockutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:35:21 compute-0 nova_compute[189265]: 2025-09-30 07:35:21.405 2 DEBUG oslo_concurrency.lockutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:35:23 compute-0 podman[220901]: 2025-09-30 07:35:23.522651307 +0000 UTC m=+0.087210073 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Sep 30 07:35:23 compute-0 podman[220900]: 2025-09-30 07:35:23.5363164 +0000 UTC m=+0.105431767 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Sep 30 07:35:23 compute-0 podman[220902]: 2025-09-30 07:35:23.590634094 +0000 UTC m=+0.148923049 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible)
Sep 30 07:35:24 compute-0 nova_compute[189265]: 2025-09-30 07:35:24.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:24 compute-0 nova_compute[189265]: 2025-09-30 07:35:24.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:27.086 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:35:27 compute-0 nova_compute[189265]: 2025-09-30 07:35:27.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:27.088 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:35:27 compute-0 nova_compute[189265]: 2025-09-30 07:35:27.122 2 DEBUG nova.compute.manager [req-09f4fbb9-6ce4-4616-b595-12032c2c83e6 req-24956cf6-b81d-4001-bd3e-49da6ae660bb 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received event network-vif-unplugged-bf4216e4-2e68-4aff-8fec-7de17189c27c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:35:27 compute-0 nova_compute[189265]: 2025-09-30 07:35:27.123 2 DEBUG oslo_concurrency.lockutils [req-09f4fbb9-6ce4-4616-b595-12032c2c83e6 req-24956cf6-b81d-4001-bd3e-49da6ae660bb 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:35:27 compute-0 nova_compute[189265]: 2025-09-30 07:35:27.123 2 DEBUG oslo_concurrency.lockutils [req-09f4fbb9-6ce4-4616-b595-12032c2c83e6 req-24956cf6-b81d-4001-bd3e-49da6ae660bb 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:35:27 compute-0 nova_compute[189265]: 2025-09-30 07:35:27.124 2 DEBUG oslo_concurrency.lockutils [req-09f4fbb9-6ce4-4616-b595-12032c2c83e6 req-24956cf6-b81d-4001-bd3e-49da6ae660bb 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:35:27 compute-0 nova_compute[189265]: 2025-09-30 07:35:27.124 2 DEBUG nova.compute.manager [req-09f4fbb9-6ce4-4616-b595-12032c2c83e6 req-24956cf6-b81d-4001-bd3e-49da6ae660bb 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] No event matching network-vif-unplugged-bf4216e4-2e68-4aff-8fec-7de17189c27c in dict_keys([('network-vif-plugged', 'bf4216e4-2e68-4aff-8fec-7de17189c27c')]) pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:349
Sep 30 07:35:27 compute-0 nova_compute[189265]: 2025-09-30 07:35:27.124 2 DEBUG nova.compute.manager [req-09f4fbb9-6ce4-4616-b595-12032c2c83e6 req-24956cf6-b81d-4001-bd3e-49da6ae660bb 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received event network-vif-unplugged-bf4216e4-2e68-4aff-8fec-7de17189c27c for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:35:28 compute-0 nova_compute[189265]: 2025-09-30 07:35:28.434 2 INFO nova.compute.manager [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Took 7.03 seconds for pre_live_migration on destination host compute-1.ctlplane.example.com.
Sep 30 07:35:29 compute-0 nova_compute[189265]: 2025-09-30 07:35:29.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:29 compute-0 nova_compute[189265]: 2025-09-30 07:35:29.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:29 compute-0 nova_compute[189265]: 2025-09-30 07:35:29.222 2 DEBUG nova.compute.manager [req-c0baf670-7254-489b-8ef2-815dda43b18d req-b712ac44-a404-4b42-9a42-6d604c54821b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received event network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:35:29 compute-0 nova_compute[189265]: 2025-09-30 07:35:29.222 2 DEBUG oslo_concurrency.lockutils [req-c0baf670-7254-489b-8ef2-815dda43b18d req-b712ac44-a404-4b42-9a42-6d604c54821b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:35:29 compute-0 nova_compute[189265]: 2025-09-30 07:35:29.223 2 DEBUG oslo_concurrency.lockutils [req-c0baf670-7254-489b-8ef2-815dda43b18d req-b712ac44-a404-4b42-9a42-6d604c54821b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:35:29 compute-0 nova_compute[189265]: 2025-09-30 07:35:29.223 2 DEBUG oslo_concurrency.lockutils [req-c0baf670-7254-489b-8ef2-815dda43b18d req-b712ac44-a404-4b42-9a42-6d604c54821b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:35:29 compute-0 nova_compute[189265]: 2025-09-30 07:35:29.224 2 DEBUG nova.compute.manager [req-c0baf670-7254-489b-8ef2-815dda43b18d req-b712ac44-a404-4b42-9a42-6d604c54821b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Processing event network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 07:35:29 compute-0 nova_compute[189265]: 2025-09-30 07:35:29.224 2 DEBUG nova.compute.manager [req-c0baf670-7254-489b-8ef2-815dda43b18d req-b712ac44-a404-4b42-9a42-6d604c54821b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received event network-changed-bf4216e4-2e68-4aff-8fec-7de17189c27c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:35:29 compute-0 nova_compute[189265]: 2025-09-30 07:35:29.225 2 DEBUG nova.compute.manager [req-c0baf670-7254-489b-8ef2-815dda43b18d req-b712ac44-a404-4b42-9a42-6d604c54821b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Refreshing instance network info cache due to event network-changed-bf4216e4-2e68-4aff-8fec-7de17189c27c. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 07:35:29 compute-0 nova_compute[189265]: 2025-09-30 07:35:29.225 2 DEBUG oslo_concurrency.lockutils [req-c0baf670-7254-489b-8ef2-815dda43b18d req-b712ac44-a404-4b42-9a42-6d604c54821b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-396e11ed-839f-4c1e-be94-410ca9634b50" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:35:29 compute-0 nova_compute[189265]: 2025-09-30 07:35:29.225 2 DEBUG oslo_concurrency.lockutils [req-c0baf670-7254-489b-8ef2-815dda43b18d req-b712ac44-a404-4b42-9a42-6d604c54821b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-396e11ed-839f-4c1e-be94-410ca9634b50" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:35:29 compute-0 nova_compute[189265]: 2025-09-30 07:35:29.226 2 DEBUG nova.network.neutron [req-c0baf670-7254-489b-8ef2-815dda43b18d req-b712ac44-a404-4b42-9a42-6d604c54821b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Refreshing network info cache for port bf4216e4-2e68-4aff-8fec-7de17189c27c _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 07:35:29 compute-0 nova_compute[189265]: 2025-09-30 07:35:29.227 2 DEBUG nova.compute.manager [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 07:35:29 compute-0 nova_compute[189265]: 2025-09-30 07:35:29.735 2 WARNING neutronclient.v2_0.client [req-c0baf670-7254-489b-8ef2-815dda43b18d req-b712ac44-a404-4b42-9a42-6d604c54821b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:35:29 compute-0 nova_compute[189265]: 2025-09-30 07:35:29.739 2 DEBUG nova.compute.manager [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4aa4rtau',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='396e11ed-839f-4c1e-be94-410ca9634b50',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(a52c5996-3eca-46c5-a9b4-2623668d7600),old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9659
Sep 30 07:35:29 compute-0 podman[199733]: time="2025-09-30T07:35:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:35:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:35:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:35:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:35:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3479 "" "Go-http-client/1.1"
Sep 30 07:35:30 compute-0 nova_compute[189265]: 2025-09-30 07:35:30.301 2 DEBUG nova.objects.instance [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'migration_context' on Instance uuid 396e11ed-839f-4c1e-be94-410ca9634b50 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:35:30 compute-0 nova_compute[189265]: 2025-09-30 07:35:30.303 2 DEBUG nova.virt.libvirt.driver [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Starting monitoring of live migration _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11543
Sep 30 07:35:30 compute-0 nova_compute[189265]: 2025-09-30 07:35:30.304 2 DEBUG nova.virt.libvirt.driver [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 07:35:30 compute-0 nova_compute[189265]: 2025-09-30 07:35:30.304 2 DEBUG nova.virt.libvirt.driver [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 07:35:30 compute-0 nova_compute[189265]: 2025-09-30 07:35:30.806 2 DEBUG nova.virt.libvirt.driver [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Operation thread is still running _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11343
Sep 30 07:35:30 compute-0 nova_compute[189265]: 2025-09-30 07:35:30.807 2 DEBUG nova.virt.libvirt.driver [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Migration not running yet _live_migration_monitor /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11352
Sep 30 07:35:30 compute-0 nova_compute[189265]: 2025-09-30 07:35:30.826 2 DEBUG nova.virt.libvirt.vif [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:34:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-314749624',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-314749624',id=21,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:34:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-qh7t6g2f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:34:44Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=396e11ed-839f-4c1e-be94-410ca9634b50,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "address": "fa:16:3e:d6:c0:ac", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbf4216e4-2e", "ovs_interfaceid": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 07:35:30 compute-0 nova_compute[189265]: 2025-09-30 07:35:30.827 2 DEBUG nova.network.os_vif_util [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "address": "fa:16:3e:d6:c0:ac", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapbf4216e4-2e", "ovs_interfaceid": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:35:30 compute-0 nova_compute[189265]: 2025-09-30 07:35:30.828 2 DEBUG nova.network.os_vif_util [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c0:ac,bridge_name='br-int',has_traffic_filtering=True,id=bf4216e4-2e68-4aff-8fec-7de17189c27c,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf4216e4-2e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:35:30 compute-0 nova_compute[189265]: 2025-09-30 07:35:30.829 2 DEBUG nova.virt.libvirt.migration [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Updating guest XML with vif config: <interface type="ethernet">
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <mac address="fa:16:3e:d6:c0:ac"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <model type="virtio"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <mtu size="1442"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <target dev="tapbf4216e4-2e"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]: </interface>
Sep 30 07:35:30 compute-0 nova_compute[189265]:  _update_vif_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:534
Sep 30 07:35:30 compute-0 nova_compute[189265]: 2025-09-30 07:35:30.830 2 DEBUG nova.virt.libvirt.migration [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] _remove_cpu_shared_set_xml input xml=<domain type="kvm">
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <name>instance-00000015</name>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <uuid>396e11ed-839f-4c1e-be94-410ca9634b50</uuid>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteStrategies-server-314749624</nova:name>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:34:38</nova:creationTime>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:35:30 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:35:30 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:user uuid="89ba5d19014145188ad2a3c812acdc88">tempest-TestExecuteStrategies-1096120513-project-admin</nova:user>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:project uuid="6431607f3dce4c88bbf6d17ee6cd45b2">tempest-TestExecuteStrategies-1096120513</nova:project>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:port uuid="bf4216e4-2e68-4aff-8fec-7de17189c27c">
Sep 30 07:35:30 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <memory unit="KiB">131072</memory>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <vcpu placement="static">1</vcpu>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <resource>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <partition>/machine</partition>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </resource>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <system>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="serial">396e11ed-839f-4c1e-be94-410ca9634b50</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="uuid">396e11ed-839f-4c1e-be94-410ca9634b50</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </system>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <os>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </os>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <features>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <vmcoreinfo state="on"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </features>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <cpu mode="host-model" check="partial">
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <on_poweroff>destroy</on_poweroff>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <on_reboot>restart</on_reboot>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <on_crash>destroy</on_crash>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk.config"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <readonly/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="1" port="0x10"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="2" port="0x11"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="3" port="0x12"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="4" port="0x13"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="5" port="0x14"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="6" port="0x15"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="7" port="0x16"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="8" port="0x17"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="9" port="0x18"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="10" port="0x19"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="11" port="0x1a"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="12" port="0x1b"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="13" port="0x1c"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="14" port="0x1d"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="15" port="0x1e"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="16" port="0x1f"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="17" port="0x20"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="18" port="0x21"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="19" port="0x22"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="20" port="0x23"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="21" port="0x24"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="22" port="0x25"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="23" port="0x26"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="24" port="0x27"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="25" port="0x28"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-pci-bridge"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="sata" index="0">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <interface type="ethernet"><mac address="fa:16:3e:d6:c0:ac"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapbf4216e4-2e"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </interface><serial type="pty">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/console.log" append="off"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target type="isa-serial" port="0">
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <model name="isa-serial"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       </target>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <console type="pty">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/console.log" append="off"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target type="serial" port="0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </console>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="usb" bus="0" port="1"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </input>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <input type="mouse" bus="ps2"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <listen type="address" address="::"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </graphics>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <video>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </video>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]: </domain>
Sep 30 07:35:30 compute-0 nova_compute[189265]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:241
Sep 30 07:35:30 compute-0 nova_compute[189265]: 2025-09-30 07:35:30.834 2 DEBUG nova.virt.libvirt.migration [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] _remove_cpu_shared_set_xml output xml=<domain type="kvm">
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <name>instance-00000015</name>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <uuid>396e11ed-839f-4c1e-be94-410ca9634b50</uuid>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteStrategies-server-314749624</nova:name>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:34:38</nova:creationTime>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:35:30 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:35:30 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:user uuid="89ba5d19014145188ad2a3c812acdc88">tempest-TestExecuteStrategies-1096120513-project-admin</nova:user>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:project uuid="6431607f3dce4c88bbf6d17ee6cd45b2">tempest-TestExecuteStrategies-1096120513</nova:project>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:port uuid="bf4216e4-2e68-4aff-8fec-7de17189c27c">
Sep 30 07:35:30 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <memory unit="KiB">131072</memory>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <vcpu placement="static">1</vcpu>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <resource>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <partition>/machine</partition>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </resource>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <system>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="serial">396e11ed-839f-4c1e-be94-410ca9634b50</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="uuid">396e11ed-839f-4c1e-be94-410ca9634b50</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </system>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <os>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </os>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <features>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <vmcoreinfo state="on"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </features>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <cpu mode="host-model" check="partial">
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <on_poweroff>destroy</on_poweroff>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <on_reboot>restart</on_reboot>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <on_crash>destroy</on_crash>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk.config"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <readonly/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="1" port="0x10"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="2" port="0x11"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="3" port="0x12"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="4" port="0x13"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="5" port="0x14"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="6" port="0x15"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="7" port="0x16"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="8" port="0x17"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="9" port="0x18"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="10" port="0x19"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="11" port="0x1a"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="12" port="0x1b"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="13" port="0x1c"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="14" port="0x1d"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="15" port="0x1e"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="16" port="0x1f"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="17" port="0x20"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="18" port="0x21"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="19" port="0x22"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="20" port="0x23"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="21" port="0x24"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="22" port="0x25"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="23" port="0x26"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="24" port="0x27"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="25" port="0x28"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-pci-bridge"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="sata" index="0">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:d6:c0:ac"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target dev="tapbf4216e4-2e"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/console.log" append="off"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target type="isa-serial" port="0">
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <model name="isa-serial"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       </target>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <console type="pty">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/console.log" append="off"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target type="serial" port="0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </console>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="usb" bus="0" port="1"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </input>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <input type="mouse" bus="ps2"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <listen type="address" address="::"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </graphics>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <video>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </video>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]: </domain>
Sep 30 07:35:30 compute-0 nova_compute[189265]:  _remove_cpu_shared_set_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:250
Sep 30 07:35:30 compute-0 nova_compute[189265]: 2025-09-30 07:35:30.835 2 DEBUG nova.virt.libvirt.migration [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] _update_pci_xml output xml=<domain type="kvm">
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <name>instance-00000015</name>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <uuid>396e11ed-839f-4c1e-be94-410ca9634b50</uuid>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteStrategies-server-314749624</nova:name>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:34:38</nova:creationTime>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:35:30 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:35:30 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:user uuid="89ba5d19014145188ad2a3c812acdc88">tempest-TestExecuteStrategies-1096120513-project-admin</nova:user>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:project uuid="6431607f3dce4c88bbf6d17ee6cd45b2">tempest-TestExecuteStrategies-1096120513</nova:project>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <nova:port uuid="bf4216e4-2e68-4aff-8fec-7de17189c27c">
Sep 30 07:35:30 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <memory unit="KiB">131072</memory>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <currentMemory unit="KiB">131072</currentMemory>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <vcpu placement="static">1</vcpu>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <resource>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <partition>/machine</partition>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </resource>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <system>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="serial">396e11ed-839f-4c1e-be94-410ca9634b50</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="uuid">396e11ed-839f-4c1e-be94-410ca9634b50</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </system>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <os>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="pc-q35-rhel9.6.0">hvm</type>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </os>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <features>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <vmcoreinfo state="on"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </features>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <cpu mode="host-model" check="partial">
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <topology sockets="1" dies="1" clusters="1" cores="1" threads="1"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <on_poweroff>destroy</on_poweroff>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <on_reboot>restart</on_reboot>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <on_crash>destroy</on_crash>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x03" slot="0x00" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/disk.config"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <readonly/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="drive" controller="0" bus="0" target="0" unit="0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="0" model="pcie-root"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="1" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="1" port="0x10"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x0" multifunction="on"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="2" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="2" port="0x11"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x1"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="3" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="3" port="0x12"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x2"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="4" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="4" port="0x13"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x3"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="5" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="5" port="0x14"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x4"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="6" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="6" port="0x15"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x5"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="7" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="7" port="0x16"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x6"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="8" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="8" port="0x17"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x02" function="0x7"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="9" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="9" port="0x18"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x0" multifunction="on"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="10" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="10" port="0x19"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x1"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="11" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="11" port="0x1a"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x2"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="12" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="12" port="0x1b"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x3"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="13" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="13" port="0x1c"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x4"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="14" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="14" port="0x1d"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x5"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="15" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="15" port="0x1e"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x6"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="16" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="16" port="0x1f"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x03" function="0x7"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="17" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="17" port="0x20"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x0" multifunction="on"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="18" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="18" port="0x21"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x1"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="19" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="19" port="0x22"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x2"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="20" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="20" port="0x23"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x3"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="21" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="21" port="0x24"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x4"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="22" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="22" port="0x25"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x5"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="23" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="23" port="0x26"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x6"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="24" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="24" port="0x27"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x04" function="0x7"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="25" model="pcie-root-port">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-root-port"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target chassis="25" port="0x28"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x05" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="pci" index="26" model="pcie-to-pci-bridge">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model name="pcie-pci-bridge"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x01" slot="0x00" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="usb" index="0" model="piix3-uhci">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x1a" slot="0x01" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <controller type="sata" index="0">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x1f" function="0x2"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </controller>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <interface type="ethernet"><mac address="fa:16:3e:d6:c0:ac"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapbf4216e4-2e"/><address type="pci" domain="0x0000" bus="0x02" slot="0x00" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </interface><serial type="pty">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/console.log" append="off"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target type="isa-serial" port="0">
Sep 30 07:35:30 compute-0 nova_compute[189265]:         <model name="isa-serial"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       </target>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <console type="pty">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50/console.log" append="off"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <target type="serial" port="0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </console>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="usb" bus="0" port="1"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </input>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <input type="mouse" bus="ps2"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <graphics type="vnc" port="-1" autoport="yes" listen="::">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <listen type="address" address="::"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </graphics>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <video>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <model type="virtio" heads="1" primary="yes"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x00" slot="0x01" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </video>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:35:30 compute-0 nova_compute[189265]:       <address type="pci" domain="0x0000" bus="0x05" slot="0x00" function="0x0"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:35:30 compute-0 nova_compute[189265]:   <seclabel type="dynamic" model="selinux" relabel="yes"/>
Sep 30 07:35:30 compute-0 nova_compute[189265]: </domain>
Sep 30 07:35:30 compute-0 nova_compute[189265]:  _update_pci_dev_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:166
Sep 30 07:35:30 compute-0 nova_compute[189265]: 2025-09-30 07:35:30.835 2 DEBUG nova.virt.libvirt.driver [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] About to invoke the migrate API _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11175
Sep 30 07:35:31 compute-0 nova_compute[189265]: 2025-09-30 07:35:31.310 2 DEBUG nova.virt.libvirt.migration [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Current None elapsed 1 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 07:35:31 compute-0 nova_compute[189265]: 2025-09-30 07:35:31.310 2 INFO nova.virt.libvirt.migration [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Increasing downtime to 50 ms after 0 sec elapsed time
Sep 30 07:35:31 compute-0 openstack_network_exporter[201859]: ERROR   07:35:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:35:31 compute-0 openstack_network_exporter[201859]: ERROR   07:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:35:31 compute-0 openstack_network_exporter[201859]: ERROR   07:35:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:35:31 compute-0 openstack_network_exporter[201859]: ERROR   07:35:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:35:31 compute-0 openstack_network_exporter[201859]: ERROR   07:35:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:35:32 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:32.090 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:35:32 compute-0 nova_compute[189265]: 2025-09-30 07:35:32.419 2 INFO nova.virt.libvirt.driver [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Migration running for 1 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Sep 30 07:35:32 compute-0 nova_compute[189265]: 2025-09-30 07:35:32.787 2 WARNING neutronclient.v2_0.client [req-c0baf670-7254-489b-8ef2-815dda43b18d req-b712ac44-a404-4b42-9a42-6d604c54821b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:35:32 compute-0 nova_compute[189265]: 2025-09-30 07:35:32.922 2 DEBUG nova.virt.libvirt.migration [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Current 50 elapsed 2 steps [(0, 50), (300, 95), (600, 140), (900, 185), (1200, 230), (1500, 275), (1800, 320), (2100, 365), (2400, 410), (2700, 455), (3000, 500)] update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:658
Sep 30 07:35:32 compute-0 nova_compute[189265]: 2025-09-30 07:35:32.923 2 DEBUG nova.virt.libvirt.migration [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Downtime does not need to change update_downtime /usr/lib/python3.12/site-packages/nova/virt/libvirt/migration.py:671
Sep 30 07:35:32 compute-0 nova_compute[189265]: 2025-09-30 07:35:32.979 2 DEBUG nova.network.neutron [req-c0baf670-7254-489b-8ef2-815dda43b18d req-b712ac44-a404-4b42-9a42-6d604c54821b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Updated VIF entry in instance network info cache for port bf4216e4-2e68-4aff-8fec-7de17189c27c. _build_network_info_model /usr/lib/python3.12/site-packages/nova/network/neutron.py:3542
Sep 30 07:35:32 compute-0 nova_compute[189265]: 2025-09-30 07:35:32.980 2 DEBUG nova.network.neutron [req-c0baf670-7254-489b-8ef2-815dda43b18d req-b712ac44-a404-4b42-9a42-6d604c54821b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Updating instance_info_cache with network_info: [{"id": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "address": "fa:16:3e:d6:c0:ac", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf4216e4-2e", "ovs_interfaceid": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-1.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:35:33 compute-0 kernel: tapbf4216e4-2e (unregistering): left promiscuous mode
Sep 30 07:35:33 compute-0 NetworkManager[51813]: <info>  [1759217733.0248] device (tapbf4216e4-2e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:35:33 compute-0 ovn_controller[91436]: 2025-09-30T07:35:33Z|00205|binding|INFO|Releasing lport bf4216e4-2e68-4aff-8fec-7de17189c27c from this chassis (sb_readonly=0)
Sep 30 07:35:33 compute-0 ovn_controller[91436]: 2025-09-30T07:35:33Z|00206|binding|INFO|Setting lport bf4216e4-2e68-4aff-8fec-7de17189c27c down in Southbound
Sep 30 07:35:33 compute-0 ovn_controller[91436]: 2025-09-30T07:35:33Z|00207|binding|INFO|Removing iface tapbf4216e4-2e ovn-installed in OVS
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.037 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:c0:ac 10.100.0.5'], port_security=['fa:16:3e:d6:c0:ac 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '8a9138ed-8977-41ff-9b21-ff90eb637e78'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '396e11ed-839f-4c1e-be94-410ca9634b50', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '10', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=bf4216e4-2e68-4aff-8fec-7de17189c27c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.038 100322 INFO neutron.agent.ovn.metadata.agent [-] Port bf4216e4-2e68-4aff-8fec-7de17189c27c in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.040 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c99c822b-3191-49e5-b938-903e25b4a9bb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.040 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[a930f142-3878-42ed-a6a6-60304c93288b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.041 100322 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb namespace which is not needed anymore
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:33 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000015.scope: Deactivated successfully.
Sep 30 07:35:33 compute-0 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000015.scope: Consumed 14.224s CPU time.
Sep 30 07:35:33 compute-0 systemd-machined[149233]: Machine qemu-16-instance-00000015 terminated.
Sep 30 07:35:33 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[220689]: [NOTICE]   (220693) : haproxy version is 3.0.5-8e879a5
Sep 30 07:35:33 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[220689]: [NOTICE]   (220693) : path to executable is /usr/sbin/haproxy
Sep 30 07:35:33 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[220689]: [WARNING]  (220693) : Exiting Master process...
Sep 30 07:35:33 compute-0 podman[220997]: 2025-09-30 07:35:33.187792773 +0000 UTC m=+0.042180346 container kill eb0898251616c836e52e1cfab38621e33c8a5da14729f28618888012a196bf70 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 07:35:33 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[220689]: [ALERT]    (220693) : Current worker (220695) exited with code 143 (Terminated)
Sep 30 07:35:33 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[220689]: [WARNING]  (220693) : All workers exited. Exiting... (0)
Sep 30 07:35:33 compute-0 systemd[1]: libpod-eb0898251616c836e52e1cfab38621e33c8a5da14729f28618888012a196bf70.scope: Deactivated successfully.
Sep 30 07:35:33 compute-0 kernel: tapbf4216e4-2e: entered promiscuous mode
Sep 30 07:35:33 compute-0 NetworkManager[51813]: <info>  [1759217733.2239] manager: (tapbf4216e4-2e): new Tun device (/org/freedesktop/NetworkManager/Devices/74)
Sep 30 07:35:33 compute-0 systemd-udevd[220979]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:35:33 compute-0 kernel: tapbf4216e4-2e (unregistering): left promiscuous mode
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:33 compute-0 ovn_controller[91436]: 2025-09-30T07:35:33Z|00208|binding|INFO|Claiming lport bf4216e4-2e68-4aff-8fec-7de17189c27c for this chassis.
Sep 30 07:35:33 compute-0 ovn_controller[91436]: 2025-09-30T07:35:33Z|00209|binding|INFO|bf4216e4-2e68-4aff-8fec-7de17189c27c: Claiming fa:16:3e:d6:c0:ac 10.100.0.5
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.239 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:c0:ac 10.100.0.5'], port_security=['fa:16:3e:d6:c0:ac 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '8a9138ed-8977-41ff-9b21-ff90eb637e78'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '396e11ed-839f-4c1e-be94-410ca9634b50', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '10', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=bf4216e4-2e68-4aff-8fec-7de17189c27c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:35:33 compute-0 podman[221012]: 2025-09-30 07:35:33.249629734 +0000 UTC m=+0.042126364 container died eb0898251616c836e52e1cfab38621e33c8a5da14729f28618888012a196bf70 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:33 compute-0 ovn_controller[91436]: 2025-09-30T07:35:33Z|00210|binding|INFO|Releasing lport bf4216e4-2e68-4aff-8fec-7de17189c27c from this chassis (sb_readonly=0)
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.266 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:c0:ac 10.100.0.5'], port_security=['fa:16:3e:d6:c0:ac 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com,compute-1.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': '8a9138ed-8977-41ff-9b21-ff90eb637e78'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '396e11ed-839f-4c1e-be94-410ca9634b50', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '10', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=bf4216e4-2e68-4aff-8fec-7de17189c27c) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.293 2 DEBUG nova.virt.libvirt.driver [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Migrate API has completed _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11182
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.294 2 DEBUG nova.virt.libvirt.driver [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Migration operation thread has finished _live_migration_operation /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11230
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.294 2 DEBUG nova.virt.libvirt.driver [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Migration operation thread notification thread_finished /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11533
Sep 30 07:35:33 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eb0898251616c836e52e1cfab38621e33c8a5da14729f28618888012a196bf70-userdata-shm.mount: Deactivated successfully.
Sep 30 07:35:33 compute-0 systemd[1]: var-lib-containers-storage-overlay-6ac26c584db9f778b3c1f91dd38a68566debe66fd6bf6559b736cdd6f3439f20-merged.mount: Deactivated successfully.
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.425 2 DEBUG nova.virt.libvirt.guest [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '396e11ed-839f-4c1e-be94-410ca9634b50' (instance-00000015) get_job_info /usr/lib/python3.12/site-packages/nova/virt/libvirt/guest.py:687
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.426 2 INFO nova.virt.libvirt.driver [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Migration operation has completed
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.426 2 INFO nova.compute.manager [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] _post_live_migration() is started..
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.439 2 WARNING neutronclient.v2_0.client [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.440 2 WARNING neutronclient.v2_0.client [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.486 2 DEBUG oslo_concurrency.lockutils [req-c0baf670-7254-489b-8ef2-815dda43b18d req-b712ac44-a404-4b42-9a42-6d604c54821b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-396e11ed-839f-4c1e-be94-410ca9634b50" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:35:33 compute-0 podman[221012]: 2025-09-30 07:35:33.550617352 +0000 UTC m=+0.343113942 container cleanup eb0898251616c836e52e1cfab38621e33c8a5da14729f28618888012a196bf70 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 07:35:33 compute-0 systemd[1]: libpod-conmon-eb0898251616c836e52e1cfab38621e33c8a5da14729f28618888012a196bf70.scope: Deactivated successfully.
Sep 30 07:35:33 compute-0 podman[221027]: 2025-09-30 07:35:33.635320671 +0000 UTC m=+0.385467502 container remove eb0898251616c836e52e1cfab38621e33c8a5da14729f28618888012a196bf70 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.643 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[19917f3e-2da2-46bb-a98e-04cd685adb98]: (4, ("Tue Sep 30 07:35:33 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb (eb0898251616c836e52e1cfab38621e33c8a5da14729f28618888012a196bf70)\neb0898251616c836e52e1cfab38621e33c8a5da14729f28618888012a196bf70\nTue Sep 30 07:35:33 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb (eb0898251616c836e52e1cfab38621e33c8a5da14729f28618888012a196bf70)\neb0898251616c836e52e1cfab38621e33c8a5da14729f28618888012a196bf70\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.645 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f75ec0f6-3feb-4534-a39f-e6a71091dc57]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.646 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.647 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[2ebdbfac-f713-4d1a-8806-4c658e11f7b1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.647 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:33 compute-0 kernel: tapc99c822b-30: left promiscuous mode
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.682 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[cb055dc7-4f62-4d4a-810a-0e30bd29e5cc]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.708 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[eccaddbc-2a68-4955-8958-a060775b3ea2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.709 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d31caeb2-07c3-4c84-8393-ed34862d9c41]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.732 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[30e4724b-a194-4b58-a8dd-0fa1fd538587]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 554000, 'reachable_time': 37644, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221055, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:35:33 compute-0 systemd[1]: run-netns-ovnmeta\x2dc99c822b\x2d3191\x2d49e5\x2db938\x2d903e25b4a9bb.mount: Deactivated successfully.
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.735 100440 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.735 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[058bffe0-dbe7-410a-9bb6-78fc261e2799]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.735 100322 INFO neutron.agent.ovn.metadata.agent [-] Port bf4216e4-2e68-4aff-8fec-7de17189c27c in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.736 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c99c822b-3191-49e5-b938-903e25b4a9bb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.737 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c04900-5062-415a-b994-efe4a2f6d387]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.738 100322 INFO neutron.agent.ovn.metadata.agent [-] Port bf4216e4-2e68-4aff-8fec-7de17189c27c in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.738 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c99c822b-3191-49e5-b938-903e25b4a9bb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:35:33 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:35:33.739 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0ec780d6-13c5-4c07-83e1-d803377fdf2d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.928 2 DEBUG nova.compute.manager [req-4482d577-f8e6-44f2-b6c6-ced0c78e7a7f req-69f071dc-87cd-4f25-9bd7-4b953149890b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received event network-vif-unplugged-bf4216e4-2e68-4aff-8fec-7de17189c27c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.929 2 DEBUG oslo_concurrency.lockutils [req-4482d577-f8e6-44f2-b6c6-ced0c78e7a7f req-69f071dc-87cd-4f25-9bd7-4b953149890b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.929 2 DEBUG oslo_concurrency.lockutils [req-4482d577-f8e6-44f2-b6c6-ced0c78e7a7f req-69f071dc-87cd-4f25-9bd7-4b953149890b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.930 2 DEBUG oslo_concurrency.lockutils [req-4482d577-f8e6-44f2-b6c6-ced0c78e7a7f req-69f071dc-87cd-4f25-9bd7-4b953149890b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.930 2 DEBUG nova.compute.manager [req-4482d577-f8e6-44f2-b6c6-ced0c78e7a7f req-69f071dc-87cd-4f25-9bd7-4b953149890b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] No waiting events found dispatching network-vif-unplugged-bf4216e4-2e68-4aff-8fec-7de17189c27c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:35:33 compute-0 nova_compute[189265]: 2025-09-30 07:35:33.931 2 DEBUG nova.compute.manager [req-4482d577-f8e6-44f2-b6c6-ced0c78e7a7f req-69f071dc-87cd-4f25-9bd7-4b953149890b 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received event network-vif-unplugged-bf4216e4-2e68-4aff-8fec-7de17189c27c for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.502 2 DEBUG nova.network.neutron [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Activated binding for port bf4216e4-2e68-4aff-8fec-7de17189c27c and host compute-1.ctlplane.example.com migrate_instance_start /usr/lib/python3.12/site-packages/nova/network/neutron.py:3241
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.503 2 DEBUG nova.compute.manager [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "address": "fa:16:3e:d6:c0:ac", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf4216e4-2e", "ovs_interfaceid": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10059
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.505 2 DEBUG nova.virt.libvirt.vif [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:34:26Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-314749624',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-314749624',id=21,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:34:43Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-qh7t6g2f',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:35:11Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=396e11ed-839f-4c1e-be94-410ca9634b50,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "address": "fa:16:3e:d6:c0:ac", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf4216e4-2e", "ovs_interfaceid": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.505 2 DEBUG nova.network.os_vif_util [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "address": "fa:16:3e:d6:c0:ac", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbf4216e4-2e", "ovs_interfaceid": "bf4216e4-2e68-4aff-8fec-7de17189c27c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.506 2 DEBUG nova.network.os_vif_util [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c0:ac,bridge_name='br-int',has_traffic_filtering=True,id=bf4216e4-2e68-4aff-8fec-7de17189c27c,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf4216e4-2e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.507 2 DEBUG os_vif [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c0:ac,bridge_name='br-int',has_traffic_filtering=True,id=bf4216e4-2e68-4aff-8fec-7de17189c27c,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf4216e4-2e') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.510 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbf4216e4-2e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.518 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=0c9ede59-cb28-4b47-bca5-09ec14d5dbfc) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.523 2 INFO os_vif [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c0:ac,bridge_name='br-int',has_traffic_filtering=True,id=bf4216e4-2e68-4aff-8fec-7de17189c27c,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbf4216e4-2e')
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.524 2 DEBUG oslo_concurrency.lockutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.525 2 DEBUG oslo_concurrency.lockutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.525 2 DEBUG oslo_concurrency.lockutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.526 2 DEBUG nova.compute.manager [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:10082
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.526 2 INFO nova.virt.libvirt.driver [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Deleting instance files /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50_del
Sep 30 07:35:34 compute-0 nova_compute[189265]: 2025-09-30 07:35:34.527 2 INFO nova.virt.libvirt.driver [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Deletion of /var/lib/nova/instances/396e11ed-839f-4c1e-be94-410ca9634b50_del complete
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.027 2 DEBUG nova.compute.manager [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received event network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.028 2 DEBUG oslo_concurrency.lockutils [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.028 2 DEBUG oslo_concurrency.lockutils [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.028 2 DEBUG oslo_concurrency.lockutils [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.028 2 DEBUG nova.compute.manager [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] No waiting events found dispatching network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.029 2 WARNING nova.compute.manager [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received unexpected event network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c for instance with vm_state active and task_state migrating.
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.029 2 DEBUG nova.compute.manager [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received event network-vif-unplugged-bf4216e4-2e68-4aff-8fec-7de17189c27c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.029 2 DEBUG oslo_concurrency.lockutils [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.029 2 DEBUG oslo_concurrency.lockutils [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.030 2 DEBUG oslo_concurrency.lockutils [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.030 2 DEBUG nova.compute.manager [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] No waiting events found dispatching network-vif-unplugged-bf4216e4-2e68-4aff-8fec-7de17189c27c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.030 2 DEBUG nova.compute.manager [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received event network-vif-unplugged-bf4216e4-2e68-4aff-8fec-7de17189c27c for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.030 2 DEBUG nova.compute.manager [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received event network-vif-unplugged-bf4216e4-2e68-4aff-8fec-7de17189c27c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.031 2 DEBUG oslo_concurrency.lockutils [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.031 2 DEBUG oslo_concurrency.lockutils [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.031 2 DEBUG oslo_concurrency.lockutils [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.031 2 DEBUG nova.compute.manager [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] No waiting events found dispatching network-vif-unplugged-bf4216e4-2e68-4aff-8fec-7de17189c27c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.032 2 DEBUG nova.compute.manager [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received event network-vif-unplugged-bf4216e4-2e68-4aff-8fec-7de17189c27c for instance with task_state migrating. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.032 2 DEBUG nova.compute.manager [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received event network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.032 2 DEBUG oslo_concurrency.lockutils [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.032 2 DEBUG oslo_concurrency.lockutils [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.033 2 DEBUG oslo_concurrency.lockutils [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.033 2 DEBUG nova.compute.manager [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] No waiting events found dispatching network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.033 2 WARNING nova.compute.manager [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received unexpected event network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c for instance with vm_state active and task_state migrating.
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.034 2 DEBUG nova.compute.manager [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received event network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.034 2 DEBUG oslo_concurrency.lockutils [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.034 2 DEBUG oslo_concurrency.lockutils [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.034 2 DEBUG oslo_concurrency.lockutils [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.035 2 DEBUG nova.compute.manager [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] No waiting events found dispatching network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:35:36 compute-0 nova_compute[189265]: 2025-09-30 07:35:36.035 2 WARNING nova.compute.manager [req-a541f42e-4ef5-4868-a897-c62a7da91437 req-07a51690-e28a-4e22-b362-9237867c25e3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Received unexpected event network-vif-plugged-bf4216e4-2e68-4aff-8fec-7de17189c27c for instance with vm_state active and task_state migrating.
Sep 30 07:35:37 compute-0 podman[221056]: 2025-09-30 07:35:37.493661288 +0000 UTC m=+0.071024647 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:35:39 compute-0 nova_compute[189265]: 2025-09-30 07:35:39.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:39 compute-0 nova_compute[189265]: 2025-09-30 07:35:39.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:43 compute-0 nova_compute[189265]: 2025-09-30 07:35:43.062 2 DEBUG oslo_concurrency.lockutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:35:43 compute-0 nova_compute[189265]: 2025-09-30 07:35:43.062 2 DEBUG oslo_concurrency.lockutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:35:43 compute-0 nova_compute[189265]: 2025-09-30 07:35:43.063 2 DEBUG oslo_concurrency.lockutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "396e11ed-839f-4c1e-be94-410ca9634b50-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:35:43 compute-0 nova_compute[189265]: 2025-09-30 07:35:43.575 2 DEBUG oslo_concurrency.lockutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:35:43 compute-0 nova_compute[189265]: 2025-09-30 07:35:43.576 2 DEBUG oslo_concurrency.lockutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:35:43 compute-0 nova_compute[189265]: 2025-09-30 07:35:43.576 2 DEBUG oslo_concurrency.lockutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:35:43 compute-0 nova_compute[189265]: 2025-09-30 07:35:43.577 2 DEBUG nova.compute.resource_tracker [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:35:43 compute-0 nova_compute[189265]: 2025-09-30 07:35:43.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:35:43 compute-0 nova_compute[189265]: 2025-09-30 07:35:43.810 2 WARNING nova.virt.libvirt.driver [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:35:43 compute-0 nova_compute[189265]: 2025-09-30 07:35:43.812 2 DEBUG oslo_concurrency.processutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:35:43 compute-0 nova_compute[189265]: 2025-09-30 07:35:43.837 2 DEBUG oslo_concurrency.processutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:35:43 compute-0 nova_compute[189265]: 2025-09-30 07:35:43.838 2 DEBUG nova.compute.resource_tracker [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5851MB free_disk=73.30375289916992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:35:43 compute-0 nova_compute[189265]: 2025-09-30 07:35:43.839 2 DEBUG oslo_concurrency.lockutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:35:43 compute-0 nova_compute[189265]: 2025-09-30 07:35:43.839 2 DEBUG oslo_concurrency.lockutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:35:44 compute-0 nova_compute[189265]: 2025-09-30 07:35:44.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:44 compute-0 nova_compute[189265]: 2025-09-30 07:35:44.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:45 compute-0 nova_compute[189265]: 2025-09-30 07:35:45.002 2 DEBUG nova.compute.resource_tracker [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Migration for instance 396e11ed-839f-4c1e-be94-410ca9634b50 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 07:35:45 compute-0 nova_compute[189265]: 2025-09-30 07:35:45.537 2 DEBUG nova.compute.resource_tracker [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 07:35:45 compute-0 nova_compute[189265]: 2025-09-30 07:35:45.574 2 DEBUG nova.compute.resource_tracker [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Migration a52c5996-3eca-46c5-a9b4-2623668d7600 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1745
Sep 30 07:35:45 compute-0 nova_compute[189265]: 2025-09-30 07:35:45.575 2 DEBUG nova.compute.resource_tracker [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:35:45 compute-0 nova_compute[189265]: 2025-09-30 07:35:45.576 2 DEBUG nova.compute.resource_tracker [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:35:43 up  1:33,  0 user,  load average: 0.49, 0.32, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:35:45 compute-0 nova_compute[189265]: 2025-09-30 07:35:45.616 2 DEBUG nova.compute.provider_tree [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:35:45 compute-0 nova_compute[189265]: 2025-09-30 07:35:45.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:35:45 compute-0 nova_compute[189265]: 2025-09-30 07:35:45.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:35:46 compute-0 nova_compute[189265]: 2025-09-30 07:35:46.228 2 DEBUG nova.scheduler.client.report [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:35:46 compute-0 podman[221082]: 2025-09-30 07:35:46.497747317 +0000 UTC m=+0.076910996 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, tcib_build_tag=watcher_latest)
Sep 30 07:35:46 compute-0 nova_compute[189265]: 2025-09-30 07:35:46.773 2 DEBUG nova.compute.resource_tracker [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:35:46 compute-0 nova_compute[189265]: 2025-09-30 07:35:46.774 2 DEBUG oslo_concurrency.lockutils [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.935s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:35:46 compute-0 nova_compute[189265]: 2025-09-30 07:35:46.834 2 INFO nova.compute.manager [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Migrating instance to compute-1.ctlplane.example.com finished successfully.
Sep 30 07:35:47 compute-0 nova_compute[189265]: 2025-09-30 07:35:47.969 2 INFO nova.scheduler.client.report [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Deleted allocation for migration a52c5996-3eca-46c5-a9b4-2623668d7600
Sep 30 07:35:47 compute-0 nova_compute[189265]: 2025-09-30 07:35:47.969 2 DEBUG nova.virt.libvirt.driver [None req-ead8cabd-a97a-491c-b28e-17f0594d4b71 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 396e11ed-839f-4c1e-be94-410ca9634b50] Live migration monitoring is all done _live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11566
Sep 30 07:35:48 compute-0 nova_compute[189265]: 2025-09-30 07:35:48.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:35:48 compute-0 nova_compute[189265]: 2025-09-30 07:35:48.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:35:49 compute-0 nova_compute[189265]: 2025-09-30 07:35:49.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:49 compute-0 nova_compute[189265]: 2025-09-30 07:35:49.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:50 compute-0 podman[221103]: 2025-09-30 07:35:50.516707238 +0000 UTC m=+0.097236811 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vcs-type=git, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc.)
Sep 30 07:35:54 compute-0 nova_compute[189265]: 2025-09-30 07:35:54.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:54 compute-0 podman[221124]: 2025-09-30 07:35:54.479312177 +0000 UTC m=+0.059578197 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 07:35:54 compute-0 podman[221126]: 2025-09-30 07:35:54.496843142 +0000 UTC m=+0.080511160 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller)
Sep 30 07:35:54 compute-0 podman[221125]: 2025-09-30 07:35:54.504359978 +0000 UTC m=+0.085834183 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent)
Sep 30 07:35:54 compute-0 nova_compute[189265]: 2025-09-30 07:35:54.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:54 compute-0 nova_compute[189265]: 2025-09-30 07:35:54.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:35:55 compute-0 nova_compute[189265]: 2025-09-30 07:35:55.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:35:56 compute-0 nova_compute[189265]: 2025-09-30 07:35:56.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:35:57 compute-0 nova_compute[189265]: 2025-09-30 07:35:57.295 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:35:57 compute-0 nova_compute[189265]: 2025-09-30 07:35:57.807 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:35:57 compute-0 nova_compute[189265]: 2025-09-30 07:35:57.808 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:35:57 compute-0 nova_compute[189265]: 2025-09-30 07:35:57.808 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:35:57 compute-0 nova_compute[189265]: 2025-09-30 07:35:57.809 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:35:58 compute-0 nova_compute[189265]: 2025-09-30 07:35:58.047 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:35:58 compute-0 nova_compute[189265]: 2025-09-30 07:35:58.049 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:35:58 compute-0 nova_compute[189265]: 2025-09-30 07:35:58.075 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:35:58 compute-0 nova_compute[189265]: 2025-09-30 07:35:58.075 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5838MB free_disk=73.30377197265625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:35:58 compute-0 nova_compute[189265]: 2025-09-30 07:35:58.076 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:35:58 compute-0 nova_compute[189265]: 2025-09-30 07:35:58.076 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:35:59 compute-0 nova_compute[189265]: 2025-09-30 07:35:59.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:59 compute-0 nova_compute[189265]: 2025-09-30 07:35:59.123 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:35:59 compute-0 nova_compute[189265]: 2025-09-30 07:35:59.123 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:35:58 up  1:33,  0 user,  load average: 0.41, 0.31, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:35:59 compute-0 nova_compute[189265]: 2025-09-30 07:35:59.148 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:35:59 compute-0 nova_compute[189265]: 2025-09-30 07:35:59.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:35:59 compute-0 nova_compute[189265]: 2025-09-30 07:35:59.655 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:35:59 compute-0 podman[199733]: time="2025-09-30T07:35:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:35:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:35:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:35:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:35:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3013 "" "Go-http-client/1.1"
Sep 30 07:36:00 compute-0 nova_compute[189265]: 2025-09-30 07:36:00.174 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:36:00 compute-0 nova_compute[189265]: 2025-09-30 07:36:00.174 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.098s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:36:01 compute-0 openstack_network_exporter[201859]: ERROR   07:36:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:36:01 compute-0 openstack_network_exporter[201859]: ERROR   07:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:36:01 compute-0 openstack_network_exporter[201859]: ERROR   07:36:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:36:01 compute-0 openstack_network_exporter[201859]: ERROR   07:36:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:36:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:36:01 compute-0 openstack_network_exporter[201859]: ERROR   07:36:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:36:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:36:04 compute-0 nova_compute[189265]: 2025-09-30 07:36:04.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:04 compute-0 nova_compute[189265]: 2025-09-30 07:36:04.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:05 compute-0 nova_compute[189265]: 2025-09-30 07:36:05.669 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:36:07 compute-0 nova_compute[189265]: 2025-09-30 07:36:07.322 2 DEBUG nova.compute.manager [None req-d7a24583-a141-4251-8d78-0977f4a58bac bddd62d17bac483fb429dd18b1062646 4049964ce8244dacb50493f6676c6613 - - default default] Removing trait COMPUTE_STATUS_DISABLED from compute node resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc in placement. update_compute_provider_status /usr/lib/python3.12/site-packages/nova/compute/manager.py:631
Sep 30 07:36:07 compute-0 nova_compute[189265]: 2025-09-30 07:36:07.380 2 DEBUG nova.compute.provider_tree [None req-d7a24583-a141-4251-8d78-0977f4a58bac bddd62d17bac483fb429dd18b1062646 4049964ce8244dacb50493f6676c6613 - - default default] Updating resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc generation from 28 to 31 during operation: update_traits _update_generation /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:164
Sep 30 07:36:08 compute-0 podman[221191]: 2025-09-30 07:36:08.479700132 +0000 UTC m=+0.065358254 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:36:09 compute-0 nova_compute[189265]: 2025-09-30 07:36:09.078 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:09 compute-0 nova_compute[189265]: 2025-09-30 07:36:09.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:14 compute-0 nova_compute[189265]: 2025-09-30 07:36:14.081 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:14 compute-0 nova_compute[189265]: 2025-09-30 07:36:14.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:16 compute-0 unix_chkpwd[221217]: password check failed for user (root)
Sep 30 07:36:16 compute-0 sshd-session[221215]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233  user=root
Sep 30 07:36:17 compute-0 podman[221218]: 2025-09-30 07:36:17.497194966 +0000 UTC m=+0.080630943 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=iscsid)
Sep 30 07:36:18 compute-0 sshd-session[221215]: Failed password for root from 185.156.73.233 port 45108 ssh2
Sep 30 07:36:19 compute-0 nova_compute[189265]: 2025-09-30 07:36:19.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:19 compute-0 nova_compute[189265]: 2025-09-30 07:36:19.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:36:20.571 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:36:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:36:20.572 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:36:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:36:20.572 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:36:21 compute-0 sshd-session[221215]: Connection closed by authenticating user root 185.156.73.233 port 45108 [preauth]
Sep 30 07:36:21 compute-0 podman[221239]: 2025-09-30 07:36:21.5057981 +0000 UTC m=+0.086931474 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, vcs-type=git, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 07:36:24 compute-0 nova_compute[189265]: 2025-09-30 07:36:24.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:24 compute-0 nova_compute[189265]: 2025-09-30 07:36:24.625 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:25 compute-0 podman[221262]: 2025-09-30 07:36:25.498455286 +0000 UTC m=+0.075928288 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 07:36:25 compute-0 podman[221261]: 2025-09-30 07:36:25.523867938 +0000 UTC m=+0.095147732 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 07:36:25 compute-0 podman[221263]: 2025-09-30 07:36:25.569485911 +0000 UTC m=+0.132364113 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:36:29 compute-0 nova_compute[189265]: 2025-09-30 07:36:29.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:36:29.095 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:36:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:36:29.096 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:36:29 compute-0 nova_compute[189265]: 2025-09-30 07:36:29.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:29 compute-0 nova_compute[189265]: 2025-09-30 07:36:29.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:29 compute-0 podman[199733]: time="2025-09-30T07:36:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:36:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:36:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:36:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:36:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Sep 30 07:36:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:36:31.098 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:36:31 compute-0 openstack_network_exporter[201859]: ERROR   07:36:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:36:31 compute-0 openstack_network_exporter[201859]: ERROR   07:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:36:31 compute-0 openstack_network_exporter[201859]: ERROR   07:36:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:36:31 compute-0 openstack_network_exporter[201859]: ERROR   07:36:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:36:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:36:31 compute-0 openstack_network_exporter[201859]: ERROR   07:36:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:36:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:36:34 compute-0 nova_compute[189265]: 2025-09-30 07:36:34.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:34 compute-0 nova_compute[189265]: 2025-09-30 07:36:34.656 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:39 compute-0 nova_compute[189265]: 2025-09-30 07:36:39.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:39 compute-0 podman[221321]: 2025-09-30 07:36:39.496513254 +0000 UTC m=+0.078824961 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:36:39 compute-0 nova_compute[189265]: 2025-09-30 07:36:39.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:44 compute-0 nova_compute[189265]: 2025-09-30 07:36:44.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:44 compute-0 nova_compute[189265]: 2025-09-30 07:36:44.695 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:45 compute-0 nova_compute[189265]: 2025-09-30 07:36:45.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:36:47 compute-0 nova_compute[189265]: 2025-09-30 07:36:47.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:36:47 compute-0 nova_compute[189265]: 2025-09-30 07:36:47.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:36:48 compute-0 podman[221345]: 2025-09-30 07:36:48.47654699 +0000 UTC m=+0.055135119 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 07:36:49 compute-0 nova_compute[189265]: 2025-09-30 07:36:49.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:49 compute-0 nova_compute[189265]: 2025-09-30 07:36:49.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:49 compute-0 nova_compute[189265]: 2025-09-30 07:36:49.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:36:49 compute-0 nova_compute[189265]: 2025-09-30 07:36:49.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:36:52 compute-0 podman[221365]: 2025-09-30 07:36:52.514417246 +0000 UTC m=+0.086620785 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, version=9.6, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 07:36:54 compute-0 nova_compute[189265]: 2025-09-30 07:36:54.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:54 compute-0 nova_compute[189265]: 2025-09-30 07:36:54.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:55 compute-0 nova_compute[189265]: 2025-09-30 07:36:55.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:36:56 compute-0 podman[221387]: 2025-09-30 07:36:56.495480427 +0000 UTC m=+0.069769831 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4)
Sep 30 07:36:56 compute-0 podman[221386]: 2025-09-30 07:36:56.534424188 +0000 UTC m=+0.115826917 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 07:36:56 compute-0 podman[221388]: 2025-09-30 07:36:56.55672348 +0000 UTC m=+0.123590860 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 07:36:57 compute-0 nova_compute[189265]: 2025-09-30 07:36:57.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:36:57 compute-0 nova_compute[189265]: 2025-09-30 07:36:57.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:36:58 compute-0 nova_compute[189265]: 2025-09-30 07:36:58.303 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:36:58 compute-0 nova_compute[189265]: 2025-09-30 07:36:58.304 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:36:58 compute-0 nova_compute[189265]: 2025-09-30 07:36:58.305 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:36:58 compute-0 nova_compute[189265]: 2025-09-30 07:36:58.305 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:36:58 compute-0 nova_compute[189265]: 2025-09-30 07:36:58.499 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:36:58 compute-0 nova_compute[189265]: 2025-09-30 07:36:58.501 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:36:58 compute-0 nova_compute[189265]: 2025-09-30 07:36:58.526 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:36:58 compute-0 nova_compute[189265]: 2025-09-30 07:36:58.527 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5855MB free_disk=73.30387496948242GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:36:58 compute-0 nova_compute[189265]: 2025-09-30 07:36:58.528 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:36:58 compute-0 nova_compute[189265]: 2025-09-30 07:36:58.528 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:36:59 compute-0 nova_compute[189265]: 2025-09-30 07:36:59.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:59 compute-0 nova_compute[189265]: 2025-09-30 07:36:59.591 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:36:59 compute-0 nova_compute[189265]: 2025-09-30 07:36:59.592 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:36:58 up  1:34,  0 user,  load average: 0.32, 0.29, 0.29\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:36:59 compute-0 nova_compute[189265]: 2025-09-30 07:36:59.614 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:36:59 compute-0 nova_compute[189265]: 2025-09-30 07:36:59.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:36:59 compute-0 podman[199733]: time="2025-09-30T07:36:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:36:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:36:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:36:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:36:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3015 "" "Go-http-client/1.1"
Sep 30 07:37:00 compute-0 nova_compute[189265]: 2025-09-30 07:37:00.124 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:37:00 compute-0 nova_compute[189265]: 2025-09-30 07:37:00.638 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:37:00 compute-0 nova_compute[189265]: 2025-09-30 07:37:00.639 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.111s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:37:01 compute-0 openstack_network_exporter[201859]: ERROR   07:37:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:37:01 compute-0 openstack_network_exporter[201859]: ERROR   07:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:37:01 compute-0 openstack_network_exporter[201859]: ERROR   07:37:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:37:01 compute-0 openstack_network_exporter[201859]: ERROR   07:37:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:37:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:37:01 compute-0 openstack_network_exporter[201859]: ERROR   07:37:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:37:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:37:04 compute-0 nova_compute[189265]: 2025-09-30 07:37:04.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:04 compute-0 nova_compute[189265]: 2025-09-30 07:37:04.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:07 compute-0 nova_compute[189265]: 2025-09-30 07:37:07.639 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:37:09 compute-0 nova_compute[189265]: 2025-09-30 07:37:09.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:09 compute-0 nova_compute[189265]: 2025-09-30 07:37:09.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:10 compute-0 podman[221452]: 2025-09-30 07:37:10.482375771 +0000 UTC m=+0.061588914 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:37:14 compute-0 nova_compute[189265]: 2025-09-30 07:37:14.108 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:14 compute-0 nova_compute[189265]: 2025-09-30 07:37:14.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:19 compute-0 nova_compute[189265]: 2025-09-30 07:37:19.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:19 compute-0 podman[221476]: 2025-09-30 07:37:19.486066217 +0000 UTC m=+0.068384721 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=iscsid)
Sep 30 07:37:19 compute-0 nova_compute[189265]: 2025-09-30 07:37:19.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:20.574 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:37:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:20.574 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:37:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:20.574 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:37:21 compute-0 ovn_controller[91436]: 2025-09-30T07:37:21Z|00211|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Sep 30 07:37:23 compute-0 podman[221497]: 2025-09-30 07:37:23.472068969 +0000 UTC m=+0.060514594 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 07:37:23 compute-0 nova_compute[189265]: 2025-09-30 07:37:23.784 2 DEBUG nova.virt.libvirt.driver [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Creating tmpfile /var/lib/nova/instances/tmp9ozox8_2 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 07:37:23 compute-0 nova_compute[189265]: 2025-09-30 07:37:23.785 2 WARNING neutronclient.v2_0.client [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:37:23 compute-0 nova_compute[189265]: 2025-09-30 07:37:23.790 2 DEBUG nova.compute.manager [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9ozox8_2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 07:37:23 compute-0 nova_compute[189265]: 2025-09-30 07:37:23.799 2 DEBUG nova.virt.libvirt.driver [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Creating tmpfile /var/lib/nova/instances/tmpg_duoa82 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 07:37:23 compute-0 nova_compute[189265]: 2025-09-30 07:37:23.800 2 WARNING neutronclient.v2_0.client [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:37:23 compute-0 nova_compute[189265]: 2025-09-30 07:37:23.805 2 DEBUG nova.compute.manager [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg_duoa82',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 07:37:24 compute-0 nova_compute[189265]: 2025-09-30 07:37:24.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:24 compute-0 nova_compute[189265]: 2025-09-30 07:37:24.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:25 compute-0 nova_compute[189265]: 2025-09-30 07:37:25.897 2 WARNING neutronclient.v2_0.client [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:37:25 compute-0 nova_compute[189265]: 2025-09-30 07:37:25.902 2 WARNING neutronclient.v2_0.client [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:37:27 compute-0 podman[221518]: 2025-09-30 07:37:27.480090155 +0000 UTC m=+0.067163465 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 07:37:27 compute-0 podman[221519]: 2025-09-30 07:37:27.480136737 +0000 UTC m=+0.063703996 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:37:27 compute-0 podman[221520]: 2025-09-30 07:37:27.581353542 +0000 UTC m=+0.165529508 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Sep 30 07:37:29 compute-0 nova_compute[189265]: 2025-09-30 07:37:29.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:29 compute-0 podman[199733]: time="2025-09-30T07:37:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:37:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:37:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:37:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:37:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3007 "" "Go-http-client/1.1"
Sep 30 07:37:29 compute-0 nova_compute[189265]: 2025-09-30 07:37:29.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:29 compute-0 nova_compute[189265]: 2025-09-30 07:37:29.966 2 DEBUG nova.compute.manager [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9ozox8_2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f7d28008-e1b8-4a29-ad1f-86180635f5f0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 07:37:31 compute-0 nova_compute[189265]: 2025-09-30 07:37:31.011 2 DEBUG oslo_concurrency.lockutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-f7d28008-e1b8-4a29-ad1f-86180635f5f0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:37:31 compute-0 nova_compute[189265]: 2025-09-30 07:37:31.012 2 DEBUG oslo_concurrency.lockutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-f7d28008-e1b8-4a29-ad1f-86180635f5f0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:37:31 compute-0 nova_compute[189265]: 2025-09-30 07:37:31.013 2 DEBUG nova.network.neutron [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:37:31 compute-0 openstack_network_exporter[201859]: ERROR   07:37:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:37:31 compute-0 openstack_network_exporter[201859]: ERROR   07:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:37:31 compute-0 openstack_network_exporter[201859]: ERROR   07:37:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:37:31 compute-0 openstack_network_exporter[201859]: ERROR   07:37:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:37:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:37:31 compute-0 openstack_network_exporter[201859]: ERROR   07:37:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:37:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:37:31 compute-0 nova_compute[189265]: 2025-09-30 07:37:31.519 2 WARNING neutronclient.v2_0.client [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:37:32 compute-0 nova_compute[189265]: 2025-09-30 07:37:32.261 2 WARNING neutronclient.v2_0.client [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:37:32 compute-0 nova_compute[189265]: 2025-09-30 07:37:32.430 2 DEBUG nova.network.neutron [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Updating instance_info_cache with network_info: [{"id": "1338fa03-37bf-4505-8617-f839269e1887", "address": "fa:16:3e:28:0b:d5", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1338fa03-37", "ovs_interfaceid": "1338fa03-37bf-4505-8617-f839269e1887", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:37:32 compute-0 nova_compute[189265]: 2025-09-30 07:37:32.939 2 DEBUG oslo_concurrency.lockutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-f7d28008-e1b8-4a29-ad1f-86180635f5f0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.010 2 DEBUG nova.virt.libvirt.driver [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9ozox8_2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f7d28008-e1b8-4a29-ad1f-86180635f5f0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.011 2 DEBUG nova.virt.libvirt.driver [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Creating instance directory: /var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.012 2 DEBUG nova.virt.libvirt.driver [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Creating disk.info with the contents: {'/var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0/disk': 'qcow2', '/var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.013 2 DEBUG nova.virt.libvirt.driver [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.014 2 DEBUG nova.objects.instance [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid f7d28008-e1b8-4a29-ad1f-86180635f5f0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.530 2 DEBUG oslo_utils.imageutils.format_inspector [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.537 2 DEBUG oslo_utils.imageutils.format_inspector [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.539 2 DEBUG oslo_concurrency.processutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.624 2 DEBUG oslo_concurrency.processutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.624 2 DEBUG oslo_concurrency.lockutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.625 2 DEBUG oslo_concurrency.lockutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.625 2 DEBUG oslo_utils.imageutils.format_inspector [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.628 2 DEBUG oslo_utils.imageutils.format_inspector [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.628 2 DEBUG oslo_concurrency.processutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.719 2 DEBUG oslo_concurrency.processutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.720 2 DEBUG oslo_concurrency.processutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.787 2 DEBUG oslo_concurrency.processutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0/disk 1073741824" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.788 2 DEBUG oslo_concurrency.lockutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.163s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.788 2 DEBUG oslo_concurrency.processutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.852 2 DEBUG oslo_concurrency.processutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.853 2 DEBUG nova.virt.disk.api [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Checking if we can resize image /var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.854 2 DEBUG oslo_concurrency.processutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.908 2 DEBUG oslo_concurrency.processutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.910 2 DEBUG nova.virt.disk.api [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Cannot resize image /var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:37:33 compute-0 nova_compute[189265]: 2025-09-30 07:37:33.910 2 DEBUG nova.objects.instance [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'migration_context' on Instance uuid f7d28008-e1b8-4a29-ad1f-86180635f5f0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.418 2 DEBUG nova.objects.base [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<f7d28008-e1b8-4a29-ad1f-86180635f5f0> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.419 2 DEBUG oslo_concurrency.processutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.441 2 DEBUG oslo_concurrency.processutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0/disk.config 497664" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.442 2 DEBUG nova.virt.libvirt.driver [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.443 2 DEBUG nova.virt.libvirt.vif [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T07:36:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-291732864',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-291732864',id=23,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:36:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-18wxf30x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1
',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:36:53Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=f7d28008-e1b8-4a29-ad1f-86180635f5f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1338fa03-37bf-4505-8617-f839269e1887", "address": "fa:16:3e:28:0b:d5", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1338fa03-37", "ovs_interfaceid": "1338fa03-37bf-4505-8617-f839269e1887", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.444 2 DEBUG nova.network.os_vif_util [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "1338fa03-37bf-4505-8617-f839269e1887", "address": "fa:16:3e:28:0b:d5", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap1338fa03-37", "ovs_interfaceid": "1338fa03-37bf-4505-8617-f839269e1887", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.444 2 DEBUG nova.network.os_vif_util [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:28:0b:d5,bridge_name='br-int',has_traffic_filtering=True,id=1338fa03-37bf-4505-8617-f839269e1887,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1338fa03-37') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.445 2 DEBUG os_vif [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:0b:d5,bridge_name='br-int',has_traffic_filtering=True,id=1338fa03-37bf-4505-8617-f839269e1887,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1338fa03-37') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.446 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.446 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.447 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '9de4abe5-5ab6-5f52-813f-508632b6e8f8', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.453 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1338fa03-37, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.454 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap1338fa03-37, col_values=(('qos', UUID('902ed515-5380-417d-8cbc-8201ca1f2c71')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.454 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap1338fa03-37, col_values=(('external_ids', {'iface-id': '1338fa03-37bf-4505-8617-f839269e1887', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:28:0b:d5', 'vm-uuid': 'f7d28008-e1b8-4a29-ad1f-86180635f5f0'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:34 compute-0 NetworkManager[51813]: <info>  [1759217854.4569] manager: (tap1338fa03-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.462 2 INFO os_vif [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:28:0b:d5,bridge_name='br-int',has_traffic_filtering=True,id=1338fa03-37bf-4505-8617-f839269e1887,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1338fa03-37')
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.463 2 DEBUG nova.virt.libvirt.driver [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.463 2 DEBUG nova.compute.manager [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9ozox8_2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f7d28008-e1b8-4a29-ad1f-86180635f5f0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.464 2 WARNING neutronclient.v2_0.client [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.637 2 WARNING neutronclient.v2_0.client [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:37:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:34.870 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:37:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:34.871 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:37:34 compute-0 nova_compute[189265]: 2025-09-30 07:37:34.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:35 compute-0 nova_compute[189265]: 2025-09-30 07:37:35.222 2 DEBUG nova.network.neutron [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Port 1338fa03-37bf-4505-8617-f839269e1887 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 07:37:35 compute-0 nova_compute[189265]: 2025-09-30 07:37:35.279 2 DEBUG nova.compute.manager [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp9ozox8_2',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='f7d28008-e1b8-4a29-ad1f-86180635f5f0',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 07:37:38 compute-0 systemd[1]: Starting libvirt proxy daemon...
Sep 30 07:37:38 compute-0 systemd[1]: Started libvirt proxy daemon.
Sep 30 07:37:38 compute-0 kernel: tap1338fa03-37: entered promiscuous mode
Sep 30 07:37:38 compute-0 NetworkManager[51813]: <info>  [1759217858.9823] manager: (tap1338fa03-37): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Sep 30 07:37:38 compute-0 ovn_controller[91436]: 2025-09-30T07:37:38Z|00212|binding|INFO|Claiming lport 1338fa03-37bf-4505-8617-f839269e1887 for this additional chassis.
Sep 30 07:37:38 compute-0 ovn_controller[91436]: 2025-09-30T07:37:38Z|00213|binding|INFO|1338fa03-37bf-4505-8617-f839269e1887: Claiming fa:16:3e:28:0b:d5 10.100.0.14
Sep 30 07:37:38 compute-0 nova_compute[189265]: 2025-09-30 07:37:38.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:38 compute-0 ovn_controller[91436]: 2025-09-30T07:37:38Z|00214|binding|INFO|Setting lport 1338fa03-37bf-4505-8617-f839269e1887 ovn-installed in OVS
Sep 30 07:37:38 compute-0 nova_compute[189265]: 2025-09-30 07:37:38.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:39 compute-0 nova_compute[189265]: 2025-09-30 07:37:39.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:39 compute-0 systemd-udevd[221633]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:37:39 compute-0 NetworkManager[51813]: <info>  [1759217859.0220] device (tap1338fa03-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:37:39 compute-0 NetworkManager[51813]: <info>  [1759217859.0228] device (tap1338fa03-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:37:39 compute-0 systemd-machined[149233]: New machine qemu-17-instance-00000017.
Sep 30 07:37:39 compute-0 systemd[1]: Started Virtual Machine qemu-17-instance-00000017.
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.052 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:0b:d5 10.100.0.14'], port_security=['fa:16:3e:28:0b:d5 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'f7d28008-e1b8-4a29-ad1f-86180635f5f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '10', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=1338fa03-37bf-4505-8617-f839269e1887) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.053 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 1338fa03-37bf-4505-8617-f839269e1887 in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.055 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.069 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[5bb7af6a-cab3-42d9-89ac-50a977bc6d06]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.070 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc99c822b-31 in ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.072 210650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc99c822b-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.072 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[86c888cb-357d-4759-9c0a-5ab98664cc40]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.073 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[926bee6a-d41f-4cd5-8421-ea1951c095d9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.083 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[4d9ad024-4ef3-437b-8730-91568327e481]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.100 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[90dd2ee7-a622-4402-84c6-36afbf2af10a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 nova_compute[189265]: 2025-09-30 07:37:39.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.134 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[c8ace720-5f1b-41d5-9a5d-af79a9692ae9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 systemd-udevd[221636]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:37:39 compute-0 NetworkManager[51813]: <info>  [1759217859.1415] manager: (tapc99c822b-30): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.141 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b676e4b2-f720-4dea-9f44-7f93a8f3cc35]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.179 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[353ac299-144c-470d-a95a-d6a91ebb95ec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.182 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[9b802c82-f32b-445c-9920-6b9e54b6b6c5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 NetworkManager[51813]: <info>  [1759217859.2099] device (tapc99c822b-30): carrier: link connected
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.216 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[56fd2d94-da22-4510-9923-224cdbdb35bf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.236 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb94c63-7482-4af9-bcb2-fe6ea011b7a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571693, 'reachable_time': 29932, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221667, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.254 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[5ef13d15-7dd7-4d6e-88b4-7a0eb464973b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:678c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 571693, 'tstamp': 571693}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221668, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.277 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e2a5d8-a5c6-4073-accb-0d6f60c02b77]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571693, 'reachable_time': 29932, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221669, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.311 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c35af293-21b5-4105-9a65-83b42e4a30ce]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.378 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[5d25c348-9b46-4655-9a1a-0f5e789cdba6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.380 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.380 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.380 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc99c822b-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:37:39 compute-0 nova_compute[189265]: 2025-09-30 07:37:39.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:39 compute-0 NetworkManager[51813]: <info>  [1759217859.3822] manager: (tapc99c822b-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Sep 30 07:37:39 compute-0 kernel: tapc99c822b-30: entered promiscuous mode
Sep 30 07:37:39 compute-0 nova_compute[189265]: 2025-09-30 07:37:39.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.385 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc99c822b-30, col_values=(('external_ids', {'iface-id': '67b7df48-3f38-444a-8506-1c0ec5bd1d15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:37:39 compute-0 nova_compute[189265]: 2025-09-30 07:37:39.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:39 compute-0 nova_compute[189265]: 2025-09-30 07:37:39.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.391 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[a80d1a33-d23a-40c1-9ea0-6730997ccc43]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.392 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.392 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.392 100322 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for c99c822b-3191-49e5-b938-903e25b4a9bb disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.393 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.393 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f269f199-350b-4a6a-ba1e-3a4f4625dadb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.394 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.394 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[a5f0a0af-3d8e-40f3-a7bd-6b00f188b018]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.395 100322 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: global
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     log         /dev/log local0 debug
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     log-tag     haproxy-metadata-proxy-c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     user        root
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     group       root
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     maxconn     1024
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     pidfile     /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     daemon
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: defaults
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     log global
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     mode http
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     option httplog
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     option dontlognull
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     option http-server-close
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     option forwardfor
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     retries                 3
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     timeout http-request    30s
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     timeout connect         30s
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     timeout client          32s
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     timeout server          32s
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     timeout http-keep-alive 30s
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: listen listener
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     bind 169.254.169.254:80
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:     http-request add-header X-OVN-Network-ID c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 07:37:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:39.398 100322 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'env', 'PROCESS_TAG=haproxy-c99c822b-3191-49e5-b938-903e25b4a9bb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c99c822b-3191-49e5-b938-903e25b4a9bb.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 07:37:39 compute-0 ovn_controller[91436]: 2025-09-30T07:37:39Z|00215|binding|INFO|Releasing lport 67b7df48-3f38-444a-8506-1c0ec5bd1d15 from this chassis (sb_readonly=0)
Sep 30 07:37:39 compute-0 nova_compute[189265]: 2025-09-30 07:37:39.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:39 compute-0 nova_compute[189265]: 2025-09-30 07:37:39.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:39 compute-0 podman[221701]: 2025-09-30 07:37:39.842430756 +0000 UTC m=+0.075276239 container create 90d5c23a0b523c0ca9eb1130892b65951d84d894b337556ec22aafa770c60772 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:37:39 compute-0 systemd[1]: Started libpod-conmon-90d5c23a0b523c0ca9eb1130892b65951d84d894b337556ec22aafa770c60772.scope.
Sep 30 07:37:39 compute-0 podman[221701]: 2025-09-30 07:37:39.808767157 +0000 UTC m=+0.041612730 image pull eeebcc09bc72f81ab45f5ab87eb8f6a7b554b949227aeec082bdb0732754ddc8 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 07:37:39 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:37:39 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0fea9f5cfc6dbb2268ece17d8dbae9151c092b63da2c7b5cbc7e3c1307080c1a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 07:37:39 compute-0 podman[221701]: 2025-09-30 07:37:39.937560826 +0000 UTC m=+0.170406339 container init 90d5c23a0b523c0ca9eb1130892b65951d84d894b337556ec22aafa770c60772 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 07:37:39 compute-0 podman[221701]: 2025-09-30 07:37:39.945019801 +0000 UTC m=+0.177865294 container start 90d5c23a0b523c0ca9eb1130892b65951d84d894b337556ec22aafa770c60772 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 07:37:39 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[221722]: [NOTICE]   (221727) : New worker (221729) forked
Sep 30 07:37:39 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[221722]: [NOTICE]   (221727) : Loading success.
Sep 30 07:37:40 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:37:40.873 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:37:41 compute-0 podman[221750]: 2025-09-30 07:37:41.516440317 +0000 UTC m=+0.076901056 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:37:44 compute-0 nova_compute[189265]: 2025-09-30 07:37:44.122 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:44 compute-0 ovn_controller[91436]: 2025-09-30T07:37:44Z|00216|binding|INFO|Claiming lport 1338fa03-37bf-4505-8617-f839269e1887 for this chassis.
Sep 30 07:37:44 compute-0 ovn_controller[91436]: 2025-09-30T07:37:44Z|00217|binding|INFO|1338fa03-37bf-4505-8617-f839269e1887: Claiming fa:16:3e:28:0b:d5 10.100.0.14
Sep 30 07:37:44 compute-0 ovn_controller[91436]: 2025-09-30T07:37:44Z|00218|binding|INFO|Setting lport 1338fa03-37bf-4505-8617-f839269e1887 up in Southbound
Sep 30 07:37:44 compute-0 nova_compute[189265]: 2025-09-30 07:37:44.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:46 compute-0 nova_compute[189265]: 2025-09-30 07:37:46.998 2 INFO nova.compute.manager [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Post operation of migration started
Sep 30 07:37:47 compute-0 nova_compute[189265]: 2025-09-30 07:37:46.999 2 WARNING neutronclient.v2_0.client [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:37:47 compute-0 nova_compute[189265]: 2025-09-30 07:37:47.540 2 WARNING neutronclient.v2_0.client [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:37:47 compute-0 nova_compute[189265]: 2025-09-30 07:37:47.541 2 WARNING neutronclient.v2_0.client [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:37:47 compute-0 nova_compute[189265]: 2025-09-30 07:37:47.635 2 DEBUG oslo_concurrency.lockutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-f7d28008-e1b8-4a29-ad1f-86180635f5f0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:37:47 compute-0 nova_compute[189265]: 2025-09-30 07:37:47.636 2 DEBUG oslo_concurrency.lockutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-f7d28008-e1b8-4a29-ad1f-86180635f5f0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:37:47 compute-0 nova_compute[189265]: 2025-09-30 07:37:47.636 2 DEBUG nova.network.neutron [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:37:47 compute-0 nova_compute[189265]: 2025-09-30 07:37:47.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:37:48 compute-0 nova_compute[189265]: 2025-09-30 07:37:48.142 2 WARNING neutronclient.v2_0.client [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:37:48 compute-0 nova_compute[189265]: 2025-09-30 07:37:48.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:37:49 compute-0 nova_compute[189265]: 2025-09-30 07:37:49.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:49 compute-0 nova_compute[189265]: 2025-09-30 07:37:49.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:49 compute-0 nova_compute[189265]: 2025-09-30 07:37:49.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:37:49 compute-0 nova_compute[189265]: 2025-09-30 07:37:49.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:37:49 compute-0 nova_compute[189265]: 2025-09-30 07:37:49.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:37:49 compute-0 nova_compute[189265]: 2025-09-30 07:37:49.903 2 WARNING neutronclient.v2_0.client [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:37:50 compute-0 nova_compute[189265]: 2025-09-30 07:37:50.145 2 DEBUG nova.network.neutron [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Updating instance_info_cache with network_info: [{"id": "1338fa03-37bf-4505-8617-f839269e1887", "address": "fa:16:3e:28:0b:d5", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1338fa03-37", "ovs_interfaceid": "1338fa03-37bf-4505-8617-f839269e1887", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:37:50 compute-0 podman[221774]: 2025-09-30 07:37:50.504315326 +0000 UTC m=+0.081415526 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 07:37:50 compute-0 nova_compute[189265]: 2025-09-30 07:37:50.661 2 DEBUG oslo_concurrency.lockutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-f7d28008-e1b8-4a29-ad1f-86180635f5f0" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:37:51 compute-0 nova_compute[189265]: 2025-09-30 07:37:51.221 2 DEBUG oslo_concurrency.lockutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:37:51 compute-0 nova_compute[189265]: 2025-09-30 07:37:51.222 2 DEBUG oslo_concurrency.lockutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:37:51 compute-0 nova_compute[189265]: 2025-09-30 07:37:51.222 2 DEBUG oslo_concurrency.lockutils [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:37:51 compute-0 nova_compute[189265]: 2025-09-30 07:37:51.228 2 INFO nova.virt.libvirt.driver [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 07:37:51 compute-0 virtqemud[189090]: Domain id=17 name='instance-00000017' uuid=f7d28008-e1b8-4a29-ad1f-86180635f5f0 is tainted: custom-monitor
Sep 30 07:37:52 compute-0 nova_compute[189265]: 2025-09-30 07:37:52.238 2 INFO nova.virt.libvirt.driver [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 07:37:53 compute-0 nova_compute[189265]: 2025-09-30 07:37:53.244 2 INFO nova.virt.libvirt.driver [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 07:37:53 compute-0 nova_compute[189265]: 2025-09-30 07:37:53.248 2 DEBUG nova.compute.manager [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:37:53 compute-0 nova_compute[189265]: 2025-09-30 07:37:53.767 2 DEBUG nova.objects.instance [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 07:37:54 compute-0 nova_compute[189265]: 2025-09-30 07:37:54.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:54 compute-0 nova_compute[189265]: 2025-09-30 07:37:54.460 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:54 compute-0 podman[221795]: 2025-09-30 07:37:54.498964978 +0000 UTC m=+0.083961829 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, version=9.6, architecture=x86_64, io.openshift.expose-services=)
Sep 30 07:37:54 compute-0 nova_compute[189265]: 2025-09-30 07:37:54.802 2 WARNING neutronclient.v2_0.client [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:37:55 compute-0 nova_compute[189265]: 2025-09-30 07:37:55.258 2 WARNING neutronclient.v2_0.client [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:37:55 compute-0 nova_compute[189265]: 2025-09-30 07:37:55.259 2 WARNING neutronclient.v2_0.client [None req-23f1e019-d442-41fa-9071-f0069950d699 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:37:56 compute-0 nova_compute[189265]: 2025-09-30 07:37:56.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:37:58 compute-0 podman[221820]: 2025-09-30 07:37:58.512894256 +0000 UTC m=+0.077757851 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, managed_by=edpm_ansible)
Sep 30 07:37:58 compute-0 podman[221821]: 2025-09-30 07:37:58.549371806 +0000 UTC m=+0.117460664 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 07:37:58 compute-0 podman[221819]: 2025-09-30 07:37:58.549883521 +0000 UTC m=+0.119611406 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 07:37:58 compute-0 nova_compute[189265]: 2025-09-30 07:37:58.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:37:58 compute-0 nova_compute[189265]: 2025-09-30 07:37:58.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:37:59 compute-0 nova_compute[189265]: 2025-09-30 07:37:59.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:59 compute-0 nova_compute[189265]: 2025-09-30 07:37:59.311 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:37:59 compute-0 nova_compute[189265]: 2025-09-30 07:37:59.312 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:37:59 compute-0 nova_compute[189265]: 2025-09-30 07:37:59.312 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:37:59 compute-0 nova_compute[189265]: 2025-09-30 07:37:59.312 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:37:59 compute-0 nova_compute[189265]: 2025-09-30 07:37:59.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:37:59 compute-0 podman[199733]: time="2025-09-30T07:37:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:37:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:37:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:37:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:37:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3478 "" "Go-http-client/1.1"
Sep 30 07:38:00 compute-0 nova_compute[189265]: 2025-09-30 07:38:00.400 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:38:00 compute-0 nova_compute[189265]: 2025-09-30 07:38:00.453 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:38:00 compute-0 nova_compute[189265]: 2025-09-30 07:38:00.454 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:38:00 compute-0 nova_compute[189265]: 2025-09-30 07:38:00.505 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:38:00 compute-0 nova_compute[189265]: 2025-09-30 07:38:00.637 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:38:00 compute-0 nova_compute[189265]: 2025-09-30 07:38:00.638 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:38:00 compute-0 nova_compute[189265]: 2025-09-30 07:38:00.669 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:38:00 compute-0 nova_compute[189265]: 2025-09-30 07:38:00.669 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5687MB free_disk=73.27489471435547GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:38:00 compute-0 nova_compute[189265]: 2025-09-30 07:38:00.670 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:38:00 compute-0 nova_compute[189265]: 2025-09-30 07:38:00.670 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:38:01 compute-0 openstack_network_exporter[201859]: ERROR   07:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:38:01 compute-0 openstack_network_exporter[201859]: ERROR   07:38:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:38:01 compute-0 openstack_network_exporter[201859]: ERROR   07:38:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:38:01 compute-0 openstack_network_exporter[201859]: ERROR   07:38:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:38:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:38:01 compute-0 openstack_network_exporter[201859]: ERROR   07:38:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:38:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:38:01 compute-0 nova_compute[189265]: 2025-09-30 07:38:01.689 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Applying migration context for instance f7d28008-e1b8-4a29-ad1f-86180635f5f0 as it has an incoming, in-progress migration 87022c68-5934-4149-8ba5-6106b6d1c692. Migration status is running _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1046
Sep 30 07:38:01 compute-0 nova_compute[189265]: 2025-09-30 07:38:01.690 2 DEBUG nova.objects.instance [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 07:38:02 compute-0 nova_compute[189265]: 2025-09-30 07:38:02.197 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Migration for instance d5cc22f2-bd83-4bac-9ebe-9055fb0761c5 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:979
Sep 30 07:38:02 compute-0 nova_compute[189265]: 2025-09-30 07:38:02.707 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1596
Sep 30 07:38:02 compute-0 nova_compute[189265]: 2025-09-30 07:38:02.708 2 INFO nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Updating resource usage from migration e9fbdf35-6baa-4a41-b342-45620b4277d1
Sep 30 07:38:02 compute-0 nova_compute[189265]: 2025-09-30 07:38:02.709 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Starting to track incoming migration e9fbdf35-6baa-4a41-b342-45620b4277d1 with flavor ded17455-f8fe-40c7-8dae-6f0a2b208ae0 _update_usage_from_migration /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1536
Sep 30 07:38:03 compute-0 nova_compute[189265]: 2025-09-30 07:38:03.263 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance f7d28008-e1b8-4a29-ad1f-86180635f5f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:38:03 compute-0 nova_compute[189265]: 2025-09-30 07:38:03.758 2 DEBUG nova.compute.manager [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg_duoa82',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d5cc22f2-bd83-4bac-9ebe-9055fb0761c5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 07:38:03 compute-0 nova_compute[189265]: 2025-09-30 07:38:03.768 2 WARNING nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance d5cc22f2-bd83-4bac-9ebe-9055fb0761c5 has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 07:38:03 compute-0 nova_compute[189265]: 2025-09-30 07:38:03.768 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:38:03 compute-0 nova_compute[189265]: 2025-09-30 07:38:03.768 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:38:00 up  1:35,  0 user,  load average: 0.40, 0.30, 0.29\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_6431607f3dce4c88bbf6d17ee6cd45b2': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:38:03 compute-0 nova_compute[189265]: 2025-09-30 07:38:03.825 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:38:04 compute-0 nova_compute[189265]: 2025-09-30 07:38:04.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:04 compute-0 nova_compute[189265]: 2025-09-30 07:38:04.374 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:38:04 compute-0 nova_compute[189265]: 2025-09-30 07:38:04.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:04 compute-0 nova_compute[189265]: 2025-09-30 07:38:04.801 2 DEBUG oslo_concurrency.lockutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-d5cc22f2-bd83-4bac-9ebe-9055fb0761c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:38:04 compute-0 nova_compute[189265]: 2025-09-30 07:38:04.802 2 DEBUG oslo_concurrency.lockutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-d5cc22f2-bd83-4bac-9ebe-9055fb0761c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:38:04 compute-0 nova_compute[189265]: 2025-09-30 07:38:04.802 2 DEBUG nova.network.neutron [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:38:04 compute-0 nova_compute[189265]: 2025-09-30 07:38:04.890 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:38:04 compute-0 nova_compute[189265]: 2025-09-30 07:38:04.891 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 4.221s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:38:05 compute-0 nova_compute[189265]: 2025-09-30 07:38:05.321 2 WARNING neutronclient.v2_0.client [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:38:06 compute-0 nova_compute[189265]: 2025-09-30 07:38:06.902 2 WARNING neutronclient.v2_0.client [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:38:07 compute-0 nova_compute[189265]: 2025-09-30 07:38:07.138 2 DEBUG nova.network.neutron [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Updating instance_info_cache with network_info: [{"id": "a74bc9cb-9db6-433c-8dd4-31b19f4a26c7", "address": "fa:16:3e:1c:98:63", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa74bc9cb-9d", "ovs_interfaceid": "a74bc9cb-9db6-433c-8dd4-31b19f4a26c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:38:07 compute-0 nova_compute[189265]: 2025-09-30 07:38:07.657 2 DEBUG oslo_concurrency.lockutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-d5cc22f2-bd83-4bac-9ebe-9055fb0761c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:38:07 compute-0 nova_compute[189265]: 2025-09-30 07:38:07.698 2 DEBUG nova.virt.libvirt.driver [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg_duoa82',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d5cc22f2-bd83-4bac-9ebe-9055fb0761c5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 07:38:07 compute-0 nova_compute[189265]: 2025-09-30 07:38:07.699 2 DEBUG nova.virt.libvirt.driver [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Creating instance directory: /var/lib/nova/instances/d5cc22f2-bd83-4bac-9ebe-9055fb0761c5 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 07:38:07 compute-0 nova_compute[189265]: 2025-09-30 07:38:07.700 2 DEBUG nova.virt.libvirt.driver [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Creating disk.info with the contents: {'/var/lib/nova/instances/d5cc22f2-bd83-4bac-9ebe-9055fb0761c5/disk': 'qcow2', '/var/lib/nova/instances/d5cc22f2-bd83-4bac-9ebe-9055fb0761c5/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 07:38:07 compute-0 nova_compute[189265]: 2025-09-30 07:38:07.700 2 DEBUG nova.virt.libvirt.driver [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 07:38:07 compute-0 nova_compute[189265]: 2025-09-30 07:38:07.701 2 DEBUG nova.objects.instance [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid d5cc22f2-bd83-4bac-9ebe-9055fb0761c5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:38:07 compute-0 nova_compute[189265]: 2025-09-30 07:38:07.886 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.219 2 DEBUG oslo_utils.imageutils.format_inspector [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.225 2 DEBUG oslo_utils.imageutils.format_inspector [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.227 2 DEBUG oslo_concurrency.processutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.318 2 DEBUG oslo_concurrency.processutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.319 2 DEBUG oslo_concurrency.lockutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.320 2 DEBUG oslo_concurrency.lockutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.322 2 DEBUG oslo_utils.imageutils.format_inspector [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.328 2 DEBUG oslo_utils.imageutils.format_inspector [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.329 2 DEBUG oslo_concurrency.processutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.399 2 DEBUG oslo_concurrency.processutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.400 2 DEBUG oslo_concurrency.processutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/d5cc22f2-bd83-4bac-9ebe-9055fb0761c5/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.411 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.435 2 DEBUG oslo_concurrency.processutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/d5cc22f2-bd83-4bac-9ebe-9055fb0761c5/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.436 2 DEBUG oslo_concurrency.lockutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.116s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.437 2 DEBUG oslo_concurrency.processutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.489 2 DEBUG oslo_concurrency.processutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.491 2 DEBUG nova.virt.disk.api [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Checking if we can resize image /var/lib/nova/instances/d5cc22f2-bd83-4bac-9ebe-9055fb0761c5/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.491 2 DEBUG oslo_concurrency.processutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d5cc22f2-bd83-4bac-9ebe-9055fb0761c5/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.545 2 DEBUG oslo_concurrency.processutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d5cc22f2-bd83-4bac-9ebe-9055fb0761c5/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.548 2 DEBUG nova.virt.disk.api [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Cannot resize image /var/lib/nova/instances/d5cc22f2-bd83-4bac-9ebe-9055fb0761c5/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:38:08 compute-0 nova_compute[189265]: 2025-09-30 07:38:08.549 2 DEBUG nova.objects.instance [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'migration_context' on Instance uuid d5cc22f2-bd83-4bac-9ebe-9055fb0761c5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.057 2 DEBUG nova.objects.base [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<d5cc22f2-bd83-4bac-9ebe-9055fb0761c5> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.058 2 DEBUG oslo_concurrency.processutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/d5cc22f2-bd83-4bac-9ebe-9055fb0761c5/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.095 2 DEBUG oslo_concurrency.processutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/d5cc22f2-bd83-4bac-9ebe-9055fb0761c5/disk.config 497664" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.096 2 DEBUG nova.virt.libvirt.driver [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.097 2 DEBUG nova.virt.libvirt.vif [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T07:36:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1166559026',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1166559026',id=22,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:36:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-tm712ta8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:36:30Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=d5cc22f2-bd83-4bac-9ebe-9055fb0761c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a74bc9cb-9db6-433c-8dd4-31b19f4a26c7", "address": "fa:16:3e:1c:98:63", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa74bc9cb-9d", "ovs_interfaceid": "a74bc9cb-9db6-433c-8dd4-31b19f4a26c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.097 2 DEBUG nova.network.os_vif_util [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "a74bc9cb-9db6-433c-8dd4-31b19f4a26c7", "address": "fa:16:3e:1c:98:63", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa74bc9cb-9d", "ovs_interfaceid": "a74bc9cb-9db6-433c-8dd4-31b19f4a26c7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.098 2 DEBUG nova.network.os_vif_util [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:98:63,bridge_name='br-int',has_traffic_filtering=True,id=a74bc9cb-9db6-433c-8dd4-31b19f4a26c7,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa74bc9cb-9d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.099 2 DEBUG os_vif [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:98:63,bridge_name='br-int',has_traffic_filtering=True,id=a74bc9cb-9db6-433c-8dd4-31b19f4a26c7,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa74bc9cb-9d') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.100 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.100 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.102 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'a81469c7-6d6b-5fee-95e3-5064d7137837', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.108 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa74bc9cb-9d, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.108 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapa74bc9cb-9d, col_values=(('qos', UUID('2917a55b-d5e3-4e3e-9b71-e08e6f916e0c')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.108 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapa74bc9cb-9d, col_values=(('external_ids', {'iface-id': 'a74bc9cb-9db6-433c-8dd4-31b19f4a26c7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1c:98:63', 'vm-uuid': 'd5cc22f2-bd83-4bac-9ebe-9055fb0761c5'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:09 compute-0 NetworkManager[51813]: <info>  [1759217889.1111] manager: (tapa74bc9cb-9d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.116 2 INFO os_vif [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:98:63,bridge_name='br-int',has_traffic_filtering=True,id=a74bc9cb-9db6-433c-8dd4-31b19f4a26c7,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa74bc9cb-9d')
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.117 2 DEBUG nova.virt.libvirt.driver [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.117 2 DEBUG nova.compute.manager [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg_duoa82',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d5cc22f2-bd83-4bac-9ebe-9055fb0761c5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.118 2 WARNING neutronclient.v2_0.client [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.225 2 WARNING neutronclient.v2_0.client [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.729 2 DEBUG nova.network.neutron [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Port a74bc9cb-9db6-433c-8dd4-31b19f4a26c7 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 07:38:09 compute-0 nova_compute[189265]: 2025-09-30 07:38:09.748 2 DEBUG nova.compute.manager [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpg_duoa82',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='d5cc22f2-bd83-4bac-9ebe-9055fb0761c5',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 07:38:12 compute-0 podman[221914]: 2025-09-30 07:38:12.488749769 +0000 UTC m=+0.066447875 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:38:13 compute-0 kernel: tapa74bc9cb-9d: entered promiscuous mode
Sep 30 07:38:13 compute-0 NetworkManager[51813]: <info>  [1759217893.3089] manager: (tapa74bc9cb-9d): new Tun device (/org/freedesktop/NetworkManager/Devices/80)
Sep 30 07:38:13 compute-0 ovn_controller[91436]: 2025-09-30T07:38:13Z|00219|binding|INFO|Claiming lport a74bc9cb-9db6-433c-8dd4-31b19f4a26c7 for this additional chassis.
Sep 30 07:38:13 compute-0 ovn_controller[91436]: 2025-09-30T07:38:13Z|00220|binding|INFO|a74bc9cb-9db6-433c-8dd4-31b19f4a26c7: Claiming fa:16:3e:1c:98:63 10.100.0.9
Sep 30 07:38:13 compute-0 nova_compute[189265]: 2025-09-30 07:38:13.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:13.324 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:98:63 10.100.0.9'], port_security=['fa:16:3e:1c:98:63 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'd5cc22f2-bd83-4bac-9ebe-9055fb0761c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '10', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=a74bc9cb-9db6-433c-8dd4-31b19f4a26c7) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:38:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:13.327 100322 INFO neutron.agent.ovn.metadata.agent [-] Port a74bc9cb-9db6-433c-8dd4-31b19f4a26c7 in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:38:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:13.329 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:38:13 compute-0 ovn_controller[91436]: 2025-09-30T07:38:13Z|00221|binding|INFO|Setting lport a74bc9cb-9db6-433c-8dd4-31b19f4a26c7 ovn-installed in OVS
Sep 30 07:38:13 compute-0 nova_compute[189265]: 2025-09-30 07:38:13.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:13 compute-0 nova_compute[189265]: 2025-09-30 07:38:13.343 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:13 compute-0 nova_compute[189265]: 2025-09-30 07:38:13.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:13.356 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7909ae17-7200-426a-801f-7b2fdca3f169]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:13 compute-0 systemd-udevd[221952]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:38:13 compute-0 systemd-machined[149233]: New machine qemu-18-instance-00000016.
Sep 30 07:38:13 compute-0 NetworkManager[51813]: <info>  [1759217893.3827] device (tapa74bc9cb-9d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:38:13 compute-0 NetworkManager[51813]: <info>  [1759217893.3843] device (tapa74bc9cb-9d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:38:13 compute-0 systemd[1]: Started Virtual Machine qemu-18-instance-00000016.
Sep 30 07:38:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:13.397 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[ffd691bc-7b63-4e33-84e0-43deb81242fd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:13.400 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[a6709d49-c08f-4723-aa72-c206b678df31]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:13.436 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[432b4147-cc06-4041-aaed-99e120c9de1c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:13.453 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[621db3b9-7b9e-4466-9043-eefdc53c3d2e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571693, 'reachable_time': 29932, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221963, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:13.469 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c0399e70-76fd-4712-ae1b-27a2267ab42a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 571706, 'tstamp': 571706}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221966, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 571709, 'tstamp': 571709}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221966, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:13.471 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:13 compute-0 nova_compute[189265]: 2025-09-30 07:38:13.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:13 compute-0 nova_compute[189265]: 2025-09-30 07:38:13.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:13.474 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc99c822b-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:13.474 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:38:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:13.474 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc99c822b-30, col_values=(('external_ids', {'iface-id': '67b7df48-3f38-444a-8506-1c0ec5bd1d15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:13.474 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:38:13 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:13.475 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d013ffc5-d57e-4936-99d4-940b10db5a7f]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-c99c822b-3191-49e5-b938-903e25b4a9bb\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID c99c822b-3191-49e5-b938-903e25b4a9bb\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:14 compute-0 nova_compute[189265]: 2025-09-30 07:38:14.110 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:14 compute-0 nova_compute[189265]: 2025-09-30 07:38:14.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:16 compute-0 ovn_controller[91436]: 2025-09-30T07:38:16Z|00222|binding|INFO|Claiming lport a74bc9cb-9db6-433c-8dd4-31b19f4a26c7 for this chassis.
Sep 30 07:38:16 compute-0 ovn_controller[91436]: 2025-09-30T07:38:16Z|00223|binding|INFO|a74bc9cb-9db6-433c-8dd4-31b19f4a26c7: Claiming fa:16:3e:1c:98:63 10.100.0.9
Sep 30 07:38:16 compute-0 ovn_controller[91436]: 2025-09-30T07:38:16Z|00224|binding|INFO|Setting lport a74bc9cb-9db6-433c-8dd4-31b19f4a26c7 up in Southbound
Sep 30 07:38:18 compute-0 nova_compute[189265]: 2025-09-30 07:38:18.136 2 INFO nova.compute.manager [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Post operation of migration started
Sep 30 07:38:18 compute-0 nova_compute[189265]: 2025-09-30 07:38:18.137 2 WARNING neutronclient.v2_0.client [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:38:18 compute-0 nova_compute[189265]: 2025-09-30 07:38:18.935 2 WARNING neutronclient.v2_0.client [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:38:18 compute-0 nova_compute[189265]: 2025-09-30 07:38:18.935 2 WARNING neutronclient.v2_0.client [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:38:19 compute-0 nova_compute[189265]: 2025-09-30 07:38:19.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:19 compute-0 nova_compute[189265]: 2025-09-30 07:38:19.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:19 compute-0 nova_compute[189265]: 2025-09-30 07:38:19.244 2 DEBUG oslo_concurrency.lockutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-d5cc22f2-bd83-4bac-9ebe-9055fb0761c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:38:19 compute-0 nova_compute[189265]: 2025-09-30 07:38:19.244 2 DEBUG oslo_concurrency.lockutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-d5cc22f2-bd83-4bac-9ebe-9055fb0761c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:38:19 compute-0 nova_compute[189265]: 2025-09-30 07:38:19.245 2 DEBUG nova.network.neutron [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:38:19 compute-0 nova_compute[189265]: 2025-09-30 07:38:19.755 2 WARNING neutronclient.v2_0.client [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:38:20 compute-0 nova_compute[189265]: 2025-09-30 07:38:20.244 2 WARNING neutronclient.v2_0.client [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:38:20 compute-0 nova_compute[189265]: 2025-09-30 07:38:20.430 2 DEBUG nova.network.neutron [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Updating instance_info_cache with network_info: [{"id": "a74bc9cb-9db6-433c-8dd4-31b19f4a26c7", "address": "fa:16:3e:1c:98:63", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa74bc9cb-9d", "ovs_interfaceid": "a74bc9cb-9db6-433c-8dd4-31b19f4a26c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:38:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:20.575 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:38:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:20.576 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:38:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:20.577 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:38:20 compute-0 nova_compute[189265]: 2025-09-30 07:38:20.937 2 DEBUG oslo_concurrency.lockutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-d5cc22f2-bd83-4bac-9ebe-9055fb0761c5" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:38:21 compute-0 nova_compute[189265]: 2025-09-30 07:38:21.478 2 DEBUG oslo_concurrency.lockutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:38:21 compute-0 nova_compute[189265]: 2025-09-30 07:38:21.479 2 DEBUG oslo_concurrency.lockutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:38:21 compute-0 nova_compute[189265]: 2025-09-30 07:38:21.480 2 DEBUG oslo_concurrency.lockutils [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:38:21 compute-0 nova_compute[189265]: 2025-09-30 07:38:21.485 2 INFO nova.virt.libvirt.driver [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 07:38:21 compute-0 virtqemud[189090]: Domain id=18 name='instance-00000016' uuid=d5cc22f2-bd83-4bac-9ebe-9055fb0761c5 is tainted: custom-monitor
Sep 30 07:38:21 compute-0 podman[221991]: 2025-09-30 07:38:21.50760147 +0000 UTC m=+0.083402962 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20250930, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 07:38:22 compute-0 nova_compute[189265]: 2025-09-30 07:38:22.495 2 INFO nova.virt.libvirt.driver [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 07:38:23 compute-0 nova_compute[189265]: 2025-09-30 07:38:23.501 2 INFO nova.virt.libvirt.driver [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 07:38:23 compute-0 nova_compute[189265]: 2025-09-30 07:38:23.505 2 DEBUG nova.compute.manager [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:38:24 compute-0 nova_compute[189265]: 2025-09-30 07:38:24.016 2 DEBUG nova.objects.instance [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 07:38:24 compute-0 nova_compute[189265]: 2025-09-30 07:38:24.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:24 compute-0 nova_compute[189265]: 2025-09-30 07:38:24.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:25 compute-0 nova_compute[189265]: 2025-09-30 07:38:25.044 2 WARNING neutronclient.v2_0.client [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:38:25 compute-0 nova_compute[189265]: 2025-09-30 07:38:25.252 2 WARNING neutronclient.v2_0.client [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:38:25 compute-0 nova_compute[189265]: 2025-09-30 07:38:25.252 2 WARNING neutronclient.v2_0.client [None req-05dc716b-5a75-497d-b154-7e400e0ff2d6 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:38:25 compute-0 podman[222012]: 2025-09-30 07:38:25.475315526 +0000 UTC m=+0.063196251 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 07:38:28 compute-0 nova_compute[189265]: 2025-09-30 07:38:28.363 2 DEBUG oslo_concurrency.lockutils [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "f7d28008-e1b8-4a29-ad1f-86180635f5f0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:38:28 compute-0 nova_compute[189265]: 2025-09-30 07:38:28.364 2 DEBUG oslo_concurrency.lockutils [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "f7d28008-e1b8-4a29-ad1f-86180635f5f0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:38:28 compute-0 nova_compute[189265]: 2025-09-30 07:38:28.364 2 DEBUG oslo_concurrency.lockutils [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "f7d28008-e1b8-4a29-ad1f-86180635f5f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:38:28 compute-0 nova_compute[189265]: 2025-09-30 07:38:28.364 2 DEBUG oslo_concurrency.lockutils [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "f7d28008-e1b8-4a29-ad1f-86180635f5f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:38:28 compute-0 nova_compute[189265]: 2025-09-30 07:38:28.365 2 DEBUG oslo_concurrency.lockutils [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "f7d28008-e1b8-4a29-ad1f-86180635f5f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:38:28 compute-0 nova_compute[189265]: 2025-09-30 07:38:28.381 2 INFO nova.compute.manager [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Terminating instance
Sep 30 07:38:28 compute-0 nova_compute[189265]: 2025-09-30 07:38:28.906 2 DEBUG nova.compute.manager [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:38:28 compute-0 kernel: tap1338fa03-37 (unregistering): left promiscuous mode
Sep 30 07:38:28 compute-0 NetworkManager[51813]: <info>  [1759217908.9322] device (tap1338fa03-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:38:28 compute-0 nova_compute[189265]: 2025-09-30 07:38:28.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:28 compute-0 ovn_controller[91436]: 2025-09-30T07:38:28Z|00225|binding|INFO|Releasing lport 1338fa03-37bf-4505-8617-f839269e1887 from this chassis (sb_readonly=0)
Sep 30 07:38:28 compute-0 ovn_controller[91436]: 2025-09-30T07:38:28Z|00226|binding|INFO|Setting lport 1338fa03-37bf-4505-8617-f839269e1887 down in Southbound
Sep 30 07:38:28 compute-0 ovn_controller[91436]: 2025-09-30T07:38:28Z|00227|binding|INFO|Removing iface tap1338fa03-37 ovn-installed in OVS
Sep 30 07:38:28 compute-0 nova_compute[189265]: 2025-09-30 07:38:28.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:28.956 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:28:0b:d5 10.100.0.14'], port_security=['fa:16:3e:28:0b:d5 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'f7d28008-e1b8-4a29-ad1f-86180635f5f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '15', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=1338fa03-37bf-4505-8617-f839269e1887) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:38:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:28.957 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 1338fa03-37bf-4505-8617-f839269e1887 in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:38:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:28.958 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:38:28 compute-0 nova_compute[189265]: 2025-09-30 07:38:28.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:28.972 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d3ddf773-8d3b-4fb1-a5a9-7909a8ac7b5d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:28 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000017.scope: Deactivated successfully.
Sep 30 07:38:28 compute-0 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000017.scope: Consumed 4.034s CPU time.
Sep 30 07:38:28 compute-0 systemd-machined[149233]: Machine qemu-17-instance-00000017 terminated.
Sep 30 07:38:29 compute-0 podman[222034]: 2025-09-30 07:38:29.021288635 +0000 UTC m=+0.063470188 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 07:38:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:29.022 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[acb92131-bee8-442d-a90c-838d4a87aca3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:29.024 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[10ad7b4c-5c8b-4ef9-ae8f-0a52c41cbdec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:29 compute-0 podman[222037]: 2025-09-30 07:38:29.038895472 +0000 UTC m=+0.075431883 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, 
tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 07:38:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:29.053 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[3974902b-7326-4e2f-8120-405e1650ff42]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:29 compute-0 podman[222038]: 2025-09-30 07:38:29.058565019 +0000 UTC m=+0.095885782 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 07:38:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:29.069 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0431322b-abcc-45da-8755-797d03e3ecb4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 54], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571693, 'reachable_time': 29932, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222105, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:29.089 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[20324a56-e91f-4d4e-ab8c-88169c48938a]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 571706, 'tstamp': 571706}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222106, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 571709, 'tstamp': 571709}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222106, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:29.090 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:29.133 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc99c822b-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:29.133 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:38:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:29.134 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc99c822b-30, col_values=(('external_ids', {'iface-id': '67b7df48-3f38-444a-8506-1c0ec5bd1d15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:29.134 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:38:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:29.135 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[be502b18-6330-44a8-8374-e0689a55ebb8]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-c99c822b-3191-49e5-b938-903e25b4a9bb\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID c99c822b-3191-49e5-b938-903e25b4a9bb\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.189 2 INFO nova.virt.libvirt.driver [-] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Instance destroyed successfully.
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.190 2 DEBUG nova.objects.instance [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lazy-loading 'resources' on Instance uuid f7d28008-e1b8-4a29-ad1f-86180635f5f0 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.705 2 DEBUG nova.virt.libvirt.vif [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T07:36:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-291732864',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-291732864',id=23,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:36:53Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-18wxf30x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',clean_attempts='1',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:37:54Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=f7d28008-e1b8-4a29-ad1f-86180635f5f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1338fa03-37bf-4505-8617-f839269e1887", "address": "fa:16:3e:28:0b:d5", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1338fa03-37", "ovs_interfaceid": "1338fa03-37bf-4505-8617-f839269e1887", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.706 2 DEBUG nova.network.os_vif_util [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converting VIF {"id": "1338fa03-37bf-4505-8617-f839269e1887", "address": "fa:16:3e:28:0b:d5", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1338fa03-37", "ovs_interfaceid": "1338fa03-37bf-4505-8617-f839269e1887", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.707 2 DEBUG nova.network.os_vif_util [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:28:0b:d5,bridge_name='br-int',has_traffic_filtering=True,id=1338fa03-37bf-4505-8617-f839269e1887,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1338fa03-37') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.708 2 DEBUG os_vif [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:28:0b:d5,bridge_name='br-int',has_traffic_filtering=True,id=1338fa03-37bf-4505-8617-f839269e1887,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1338fa03-37') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.710 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.710 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1338fa03-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.717 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=902ed515-5380-417d-8cbc-8201ca1f2c71) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.721 2 INFO os_vif [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:28:0b:d5,bridge_name='br-int',has_traffic_filtering=True,id=1338fa03-37bf-4505-8617-f839269e1887,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1338fa03-37')
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.722 2 INFO nova.virt.libvirt.driver [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Deleting instance files /var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0_del
Sep 30 07:38:29 compute-0 nova_compute[189265]: 2025-09-30 07:38:29.722 2 INFO nova.virt.libvirt.driver [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Deletion of /var/lib/nova/instances/f7d28008-e1b8-4a29-ad1f-86180635f5f0_del complete
Sep 30 07:38:29 compute-0 podman[199733]: time="2025-09-30T07:38:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:38:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:38:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:38:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:38:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3479 "" "Go-http-client/1.1"
Sep 30 07:38:30 compute-0 nova_compute[189265]: 2025-09-30 07:38:30.068 2 DEBUG nova.compute.manager [req-8468d4cf-b569-47aa-99dd-b964bd1e1f85 req-b00fe132-cde1-4dc2-927d-0c9d22b529f3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Received event network-vif-unplugged-1338fa03-37bf-4505-8617-f839269e1887 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:38:30 compute-0 nova_compute[189265]: 2025-09-30 07:38:30.069 2 DEBUG oslo_concurrency.lockutils [req-8468d4cf-b569-47aa-99dd-b964bd1e1f85 req-b00fe132-cde1-4dc2-927d-0c9d22b529f3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "f7d28008-e1b8-4a29-ad1f-86180635f5f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:38:30 compute-0 nova_compute[189265]: 2025-09-30 07:38:30.070 2 DEBUG oslo_concurrency.lockutils [req-8468d4cf-b569-47aa-99dd-b964bd1e1f85 req-b00fe132-cde1-4dc2-927d-0c9d22b529f3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "f7d28008-e1b8-4a29-ad1f-86180635f5f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:38:30 compute-0 nova_compute[189265]: 2025-09-30 07:38:30.070 2 DEBUG oslo_concurrency.lockutils [req-8468d4cf-b569-47aa-99dd-b964bd1e1f85 req-b00fe132-cde1-4dc2-927d-0c9d22b529f3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "f7d28008-e1b8-4a29-ad1f-86180635f5f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:38:30 compute-0 nova_compute[189265]: 2025-09-30 07:38:30.071 2 DEBUG nova.compute.manager [req-8468d4cf-b569-47aa-99dd-b964bd1e1f85 req-b00fe132-cde1-4dc2-927d-0c9d22b529f3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] No waiting events found dispatching network-vif-unplugged-1338fa03-37bf-4505-8617-f839269e1887 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:38:30 compute-0 nova_compute[189265]: 2025-09-30 07:38:30.071 2 DEBUG nova.compute.manager [req-8468d4cf-b569-47aa-99dd-b964bd1e1f85 req-b00fe132-cde1-4dc2-927d-0c9d22b529f3 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Received event network-vif-unplugged-1338fa03-37bf-4505-8617-f839269e1887 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:38:30 compute-0 nova_compute[189265]: 2025-09-30 07:38:30.251 2 INFO nova.compute.manager [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Took 1.34 seconds to destroy the instance on the hypervisor.
Sep 30 07:38:30 compute-0 nova_compute[189265]: 2025-09-30 07:38:30.252 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:38:30 compute-0 nova_compute[189265]: 2025-09-30 07:38:30.252 2 DEBUG nova.compute.manager [-] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:38:30 compute-0 nova_compute[189265]: 2025-09-30 07:38:30.252 2 DEBUG nova.network.neutron [-] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:38:30 compute-0 nova_compute[189265]: 2025-09-30 07:38:30.253 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:38:30 compute-0 nova_compute[189265]: 2025-09-30 07:38:30.609 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:38:31 compute-0 openstack_network_exporter[201859]: ERROR   07:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:38:31 compute-0 openstack_network_exporter[201859]: ERROR   07:38:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:38:31 compute-0 openstack_network_exporter[201859]: ERROR   07:38:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:38:31 compute-0 openstack_network_exporter[201859]: ERROR   07:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:38:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:38:31 compute-0 openstack_network_exporter[201859]: ERROR   07:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:38:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:38:31 compute-0 nova_compute[189265]: 2025-09-30 07:38:31.457 2 DEBUG nova.network.neutron [-] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:38:31 compute-0 nova_compute[189265]: 2025-09-30 07:38:31.976 2 INFO nova.compute.manager [-] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Took 1.72 seconds to deallocate network for instance.
Sep 30 07:38:32 compute-0 nova_compute[189265]: 2025-09-30 07:38:32.153 2 DEBUG nova.compute.manager [req-1018064d-b73a-4b4e-9f35-30ee5f7ee9dc req-8fd1621a-a4d9-4dc9-bb6a-a87ce0ff0aa2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Received event network-vif-unplugged-1338fa03-37bf-4505-8617-f839269e1887 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:38:32 compute-0 nova_compute[189265]: 2025-09-30 07:38:32.153 2 DEBUG oslo_concurrency.lockutils [req-1018064d-b73a-4b4e-9f35-30ee5f7ee9dc req-8fd1621a-a4d9-4dc9-bb6a-a87ce0ff0aa2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "f7d28008-e1b8-4a29-ad1f-86180635f5f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:38:32 compute-0 nova_compute[189265]: 2025-09-30 07:38:32.154 2 DEBUG oslo_concurrency.lockutils [req-1018064d-b73a-4b4e-9f35-30ee5f7ee9dc req-8fd1621a-a4d9-4dc9-bb6a-a87ce0ff0aa2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "f7d28008-e1b8-4a29-ad1f-86180635f5f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:38:32 compute-0 nova_compute[189265]: 2025-09-30 07:38:32.154 2 DEBUG oslo_concurrency.lockutils [req-1018064d-b73a-4b4e-9f35-30ee5f7ee9dc req-8fd1621a-a4d9-4dc9-bb6a-a87ce0ff0aa2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "f7d28008-e1b8-4a29-ad1f-86180635f5f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:38:32 compute-0 nova_compute[189265]: 2025-09-30 07:38:32.155 2 DEBUG nova.compute.manager [req-1018064d-b73a-4b4e-9f35-30ee5f7ee9dc req-8fd1621a-a4d9-4dc9-bb6a-a87ce0ff0aa2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] No waiting events found dispatching network-vif-unplugged-1338fa03-37bf-4505-8617-f839269e1887 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:38:32 compute-0 nova_compute[189265]: 2025-09-30 07:38:32.155 2 WARNING nova.compute.manager [req-1018064d-b73a-4b4e-9f35-30ee5f7ee9dc req-8fd1621a-a4d9-4dc9-bb6a-a87ce0ff0aa2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Received unexpected event network-vif-unplugged-1338fa03-37bf-4505-8617-f839269e1887 for instance with vm_state deleted and task_state None.
Sep 30 07:38:32 compute-0 nova_compute[189265]: 2025-09-30 07:38:32.155 2 DEBUG nova.compute.manager [req-1018064d-b73a-4b4e-9f35-30ee5f7ee9dc req-8fd1621a-a4d9-4dc9-bb6a-a87ce0ff0aa2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: f7d28008-e1b8-4a29-ad1f-86180635f5f0] Received event network-vif-deleted-1338fa03-37bf-4505-8617-f839269e1887 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:38:32 compute-0 nova_compute[189265]: 2025-09-30 07:38:32.500 2 DEBUG oslo_concurrency.lockutils [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:38:32 compute-0 nova_compute[189265]: 2025-09-30 07:38:32.501 2 DEBUG oslo_concurrency.lockutils [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:38:32 compute-0 nova_compute[189265]: 2025-09-30 07:38:32.579 2 DEBUG nova.compute.provider_tree [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:38:33 compute-0 nova_compute[189265]: 2025-09-30 07:38:33.105 2 DEBUG nova.scheduler.client.report [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:38:33 compute-0 nova_compute[189265]: 2025-09-30 07:38:33.616 2 DEBUG oslo_concurrency.lockutils [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.115s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:38:33 compute-0 nova_compute[189265]: 2025-09-30 07:38:33.663 2 INFO nova.scheduler.client.report [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Deleted allocations for instance f7d28008-e1b8-4a29-ad1f-86180635f5f0
Sep 30 07:38:34 compute-0 nova_compute[189265]: 2025-09-30 07:38:34.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:34 compute-0 nova_compute[189265]: 2025-09-30 07:38:34.718 2 DEBUG oslo_concurrency.lockutils [None req-7b1febe5-9d71-46ee-b481-f4da83ea4d9a 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "f7d28008-e1b8-4a29-ad1f-86180635f5f0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.354s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:38:34 compute-0 nova_compute[189265]: 2025-09-30 07:38:34.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:35 compute-0 nova_compute[189265]: 2025-09-30 07:38:35.460 2 DEBUG oslo_concurrency.lockutils [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "d5cc22f2-bd83-4bac-9ebe-9055fb0761c5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:38:35 compute-0 nova_compute[189265]: 2025-09-30 07:38:35.461 2 DEBUG oslo_concurrency.lockutils [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "d5cc22f2-bd83-4bac-9ebe-9055fb0761c5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:38:35 compute-0 nova_compute[189265]: 2025-09-30 07:38:35.462 2 DEBUG oslo_concurrency.lockutils [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "d5cc22f2-bd83-4bac-9ebe-9055fb0761c5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:38:35 compute-0 nova_compute[189265]: 2025-09-30 07:38:35.462 2 DEBUG oslo_concurrency.lockutils [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "d5cc22f2-bd83-4bac-9ebe-9055fb0761c5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:38:35 compute-0 nova_compute[189265]: 2025-09-30 07:38:35.462 2 DEBUG oslo_concurrency.lockutils [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "d5cc22f2-bd83-4bac-9ebe-9055fb0761c5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:38:35 compute-0 nova_compute[189265]: 2025-09-30 07:38:35.480 2 INFO nova.compute.manager [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Terminating instance
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:35.999 2 DEBUG nova.compute.manager [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:38:36 compute-0 kernel: tapa74bc9cb-9d (unregistering): left promiscuous mode
Sep 30 07:38:36 compute-0 NetworkManager[51813]: <info>  [1759217916.0302] device (tapa74bc9cb-9d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:38:36 compute-0 ovn_controller[91436]: 2025-09-30T07:38:36Z|00228|binding|INFO|Releasing lport a74bc9cb-9db6-433c-8dd4-31b19f4a26c7 from this chassis (sb_readonly=0)
Sep 30 07:38:36 compute-0 ovn_controller[91436]: 2025-09-30T07:38:36Z|00229|binding|INFO|Setting lport a74bc9cb-9db6-433c-8dd4-31b19f4a26c7 down in Southbound
Sep 30 07:38:36 compute-0 ovn_controller[91436]: 2025-09-30T07:38:36Z|00230|binding|INFO|Removing iface tapa74bc9cb-9d ovn-installed in OVS
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:36.068 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:98:63 10.100.0.9'], port_security=['fa:16:3e:1c:98:63 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'd5cc22f2-bd83-4bac-9ebe-9055fb0761c5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '15', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=a74bc9cb-9db6-433c-8dd4-31b19f4a26c7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:38:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:36.070 100322 INFO neutron.agent.ovn.metadata.agent [-] Port a74bc9cb-9db6-433c-8dd4-31b19f4a26c7 in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.071 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:36.073 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c99c822b-3191-49e5-b938-903e25b4a9bb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:38:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:36.074 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[033af0c8-19a7-4e4a-80aa-706dc716513c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:36.075 100322 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb namespace which is not needed anymore
Sep 30 07:38:36 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000016.scope: Deactivated successfully.
Sep 30 07:38:36 compute-0 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000016.scope: Consumed 2.955s CPU time.
Sep 30 07:38:36 compute-0 systemd-machined[149233]: Machine qemu-18-instance-00000016 terminated.
Sep 30 07:38:36 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[221722]: [NOTICE]   (221727) : haproxy version is 3.0.5-8e879a5
Sep 30 07:38:36 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[221722]: [NOTICE]   (221727) : path to executable is /usr/sbin/haproxy
Sep 30 07:38:36 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[221722]: [WARNING]  (221727) : Exiting Master process...
Sep 30 07:38:36 compute-0 podman[222150]: 2025-09-30 07:38:36.243516306 +0000 UTC m=+0.056976561 container kill 90d5c23a0b523c0ca9eb1130892b65951d84d894b337556ec22aafa770c60772 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 07:38:36 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[221722]: [ALERT]    (221727) : Current worker (221729) exited with code 143 (Terminated)
Sep 30 07:38:36 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[221722]: [WARNING]  (221727) : All workers exited. Exiting... (0)
Sep 30 07:38:36 compute-0 systemd[1]: libpod-90d5c23a0b523c0ca9eb1130892b65951d84d894b337556ec22aafa770c60772.scope: Deactivated successfully.
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.275 2 INFO nova.virt.libvirt.driver [-] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Instance destroyed successfully.
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.276 2 DEBUG nova.objects.instance [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lazy-loading 'resources' on Instance uuid d5cc22f2-bd83-4bac-9ebe-9055fb0761c5 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:38:36 compute-0 podman[222176]: 2025-09-30 07:38:36.305708448 +0000 UTC m=+0.041959000 container died 90d5c23a0b523c0ca9eb1130892b65951d84d894b337556ec22aafa770c60772 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:38:36 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-90d5c23a0b523c0ca9eb1130892b65951d84d894b337556ec22aafa770c60772-userdata-shm.mount: Deactivated successfully.
Sep 30 07:38:36 compute-0 systemd[1]: var-lib-containers-storage-overlay-0fea9f5cfc6dbb2268ece17d8dbae9151c092b63da2c7b5cbc7e3c1307080c1a-merged.mount: Deactivated successfully.
Sep 30 07:38:36 compute-0 podman[222176]: 2025-09-30 07:38:36.347907743 +0000 UTC m=+0.084158255 container cleanup 90d5c23a0b523c0ca9eb1130892b65951d84d894b337556ec22aafa770c60772 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 07:38:36 compute-0 systemd[1]: libpod-conmon-90d5c23a0b523c0ca9eb1130892b65951d84d894b337556ec22aafa770c60772.scope: Deactivated successfully.
Sep 30 07:38:36 compute-0 podman[222186]: 2025-09-30 07:38:36.365985664 +0000 UTC m=+0.080174360 container remove 90d5c23a0b523c0ca9eb1130892b65951d84d894b337556ec22aafa770c60772 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 07:38:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:36.370 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[8d33f971-b963-429e-8874-38860fcb8ae0]: (4, ("Tue Sep 30 07:38:36 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb (90d5c23a0b523c0ca9eb1130892b65951d84d894b337556ec22aafa770c60772)\n90d5c23a0b523c0ca9eb1130892b65951d84d894b337556ec22aafa770c60772\nTue Sep 30 07:38:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb (90d5c23a0b523c0ca9eb1130892b65951d84d894b337556ec22aafa770c60772)\n90d5c23a0b523c0ca9eb1130892b65951d84d894b337556ec22aafa770c60772\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:36.372 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b63c6420-c71e-423b-a935-ce5d4355cbd9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:36.373 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:38:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:36.373 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[4304b1d9-b15a-4fe0-a10b-f68478f593af]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:36.374 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:36 compute-0 kernel: tapc99c822b-30: left promiscuous mode
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:36.398 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[e117c28f-f51c-42b5-adab-863a290f2f36]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:36.420 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b3498cc7-fe5c-4b45-a9c2-13e99c82cab9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:36.421 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7adb3821-0cbd-400d-b3c4-508b9505322b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:36.435 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[a4ff1264-5e3c-4b5f-bb83-4d85f2fe7626]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 571685, 'reachable_time': 42057, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222216, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:36.437 100440 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 07:38:36 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:36.438 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[1e50e5a8-87dd-4c4c-90f2-c6460411df2b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:38:36 compute-0 systemd[1]: run-netns-ovnmeta\x2dc99c822b\x2d3191\x2d49e5\x2db938\x2d903e25b4a9bb.mount: Deactivated successfully.
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.790 2 DEBUG nova.virt.libvirt.vif [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T07:36:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-1166559026',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-1166559026',id=22,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:36:30Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-tm712ta8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',clean_attempts='1',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk
='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:38:24Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=d5cc22f2-bd83-4bac-9ebe-9055fb0761c5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a74bc9cb-9db6-433c-8dd4-31b19f4a26c7", "address": "fa:16:3e:1c:98:63", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa74bc9cb-9d", "ovs_interfaceid": "a74bc9cb-9db6-433c-8dd4-31b19f4a26c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.790 2 DEBUG nova.network.os_vif_util [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converting VIF {"id": "a74bc9cb-9db6-433c-8dd4-31b19f4a26c7", "address": "fa:16:3e:1c:98:63", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa74bc9cb-9d", "ovs_interfaceid": "a74bc9cb-9db6-433c-8dd4-31b19f4a26c7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.790 2 DEBUG nova.network.os_vif_util [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1c:98:63,bridge_name='br-int',has_traffic_filtering=True,id=a74bc9cb-9db6-433c-8dd4-31b19f4a26c7,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa74bc9cb-9d') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.791 2 DEBUG os_vif [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:98:63,bridge_name='br-int',has_traffic_filtering=True,id=a74bc9cb-9db6-433c-8dd4-31b19f4a26c7,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa74bc9cb-9d') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.792 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa74bc9cb-9d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.794 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.797 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=2917a55b-d5e3-4e3e-9b71-e08e6f916e0c) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.800 2 INFO os_vif [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:98:63,bridge_name='br-int',has_traffic_filtering=True,id=a74bc9cb-9db6-433c-8dd4-31b19f4a26c7,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa74bc9cb-9d')
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.801 2 INFO nova.virt.libvirt.driver [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Deleting instance files /var/lib/nova/instances/d5cc22f2-bd83-4bac-9ebe-9055fb0761c5_del
Sep 30 07:38:36 compute-0 nova_compute[189265]: 2025-09-30 07:38:36.801 2 INFO nova.virt.libvirt.driver [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Deletion of /var/lib/nova/instances/d5cc22f2-bd83-4bac-9ebe-9055fb0761c5_del complete
Sep 30 07:38:37 compute-0 nova_compute[189265]: 2025-09-30 07:38:37.010 2 DEBUG nova.compute.manager [req-87782a5a-62bc-4c07-80ef-3d8515c32546 req-06fe4966-33ff-4757-8d3b-8fac85bb4cd2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Received event network-vif-unplugged-a74bc9cb-9db6-433c-8dd4-31b19f4a26c7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:38:37 compute-0 nova_compute[189265]: 2025-09-30 07:38:37.011 2 DEBUG oslo_concurrency.lockutils [req-87782a5a-62bc-4c07-80ef-3d8515c32546 req-06fe4966-33ff-4757-8d3b-8fac85bb4cd2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "d5cc22f2-bd83-4bac-9ebe-9055fb0761c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:38:37 compute-0 nova_compute[189265]: 2025-09-30 07:38:37.011 2 DEBUG oslo_concurrency.lockutils [req-87782a5a-62bc-4c07-80ef-3d8515c32546 req-06fe4966-33ff-4757-8d3b-8fac85bb4cd2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d5cc22f2-bd83-4bac-9ebe-9055fb0761c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:38:37 compute-0 nova_compute[189265]: 2025-09-30 07:38:37.011 2 DEBUG oslo_concurrency.lockutils [req-87782a5a-62bc-4c07-80ef-3d8515c32546 req-06fe4966-33ff-4757-8d3b-8fac85bb4cd2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d5cc22f2-bd83-4bac-9ebe-9055fb0761c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:38:37 compute-0 nova_compute[189265]: 2025-09-30 07:38:37.011 2 DEBUG nova.compute.manager [req-87782a5a-62bc-4c07-80ef-3d8515c32546 req-06fe4966-33ff-4757-8d3b-8fac85bb4cd2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] No waiting events found dispatching network-vif-unplugged-a74bc9cb-9db6-433c-8dd4-31b19f4a26c7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:38:37 compute-0 nova_compute[189265]: 2025-09-30 07:38:37.012 2 DEBUG nova.compute.manager [req-87782a5a-62bc-4c07-80ef-3d8515c32546 req-06fe4966-33ff-4757-8d3b-8fac85bb4cd2 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Received event network-vif-unplugged-a74bc9cb-9db6-433c-8dd4-31b19f4a26c7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:38:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:37.071 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:38:37 compute-0 nova_compute[189265]: 2025-09-30 07:38:37.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:37.073 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:38:37 compute-0 nova_compute[189265]: 2025-09-30 07:38:37.317 2 INFO nova.compute.manager [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Took 1.32 seconds to destroy the instance on the hypervisor.
Sep 30 07:38:37 compute-0 nova_compute[189265]: 2025-09-30 07:38:37.317 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:38:37 compute-0 nova_compute[189265]: 2025-09-30 07:38:37.318 2 DEBUG nova.compute.manager [-] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:38:37 compute-0 nova_compute[189265]: 2025-09-30 07:38:37.318 2 DEBUG nova.network.neutron [-] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:38:37 compute-0 nova_compute[189265]: 2025-09-30 07:38:37.318 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:38:37 compute-0 nova_compute[189265]: 2025-09-30 07:38:37.540 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:38:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:38:38.074 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:38:38 compute-0 nova_compute[189265]: 2025-09-30 07:38:38.430 2 DEBUG nova.network.neutron [-] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:38:38 compute-0 nova_compute[189265]: 2025-09-30 07:38:38.950 2 INFO nova.compute.manager [-] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Took 1.63 seconds to deallocate network for instance.
Sep 30 07:38:39 compute-0 nova_compute[189265]: 2025-09-30 07:38:39.086 2 DEBUG nova.compute.manager [req-5bba6fa1-0cc1-4339-a893-13eb4fa88c83 req-d490772f-ae4b-4872-b4dc-78457adf828e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Received event network-vif-unplugged-a74bc9cb-9db6-433c-8dd4-31b19f4a26c7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:38:39 compute-0 nova_compute[189265]: 2025-09-30 07:38:39.087 2 DEBUG oslo_concurrency.lockutils [req-5bba6fa1-0cc1-4339-a893-13eb4fa88c83 req-d490772f-ae4b-4872-b4dc-78457adf828e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "d5cc22f2-bd83-4bac-9ebe-9055fb0761c5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:38:39 compute-0 nova_compute[189265]: 2025-09-30 07:38:39.087 2 DEBUG oslo_concurrency.lockutils [req-5bba6fa1-0cc1-4339-a893-13eb4fa88c83 req-d490772f-ae4b-4872-b4dc-78457adf828e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d5cc22f2-bd83-4bac-9ebe-9055fb0761c5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:38:39 compute-0 nova_compute[189265]: 2025-09-30 07:38:39.088 2 DEBUG oslo_concurrency.lockutils [req-5bba6fa1-0cc1-4339-a893-13eb4fa88c83 req-d490772f-ae4b-4872-b4dc-78457adf828e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "d5cc22f2-bd83-4bac-9ebe-9055fb0761c5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:38:39 compute-0 nova_compute[189265]: 2025-09-30 07:38:39.088 2 DEBUG nova.compute.manager [req-5bba6fa1-0cc1-4339-a893-13eb4fa88c83 req-d490772f-ae4b-4872-b4dc-78457adf828e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] No waiting events found dispatching network-vif-unplugged-a74bc9cb-9db6-433c-8dd4-31b19f4a26c7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:38:39 compute-0 nova_compute[189265]: 2025-09-30 07:38:39.089 2 WARNING nova.compute.manager [req-5bba6fa1-0cc1-4339-a893-13eb4fa88c83 req-d490772f-ae4b-4872-b4dc-78457adf828e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Received unexpected event network-vif-unplugged-a74bc9cb-9db6-433c-8dd4-31b19f4a26c7 for instance with vm_state deleted and task_state None.
Sep 30 07:38:39 compute-0 nova_compute[189265]: 2025-09-30 07:38:39.089 2 DEBUG nova.compute.manager [req-5bba6fa1-0cc1-4339-a893-13eb4fa88c83 req-d490772f-ae4b-4872-b4dc-78457adf828e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: d5cc22f2-bd83-4bac-9ebe-9055fb0761c5] Received event network-vif-deleted-a74bc9cb-9db6-433c-8dd4-31b19f4a26c7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:38:39 compute-0 nova_compute[189265]: 2025-09-30 07:38:39.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:39 compute-0 nova_compute[189265]: 2025-09-30 07:38:39.475 2 DEBUG oslo_concurrency.lockutils [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:38:39 compute-0 nova_compute[189265]: 2025-09-30 07:38:39.475 2 DEBUG oslo_concurrency.lockutils [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:38:39 compute-0 nova_compute[189265]: 2025-09-30 07:38:39.480 2 DEBUG oslo_concurrency.lockutils [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:38:39 compute-0 nova_compute[189265]: 2025-09-30 07:38:39.518 2 INFO nova.scheduler.client.report [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Deleted allocations for instance d5cc22f2-bd83-4bac-9ebe-9055fb0761c5
Sep 30 07:38:40 compute-0 nova_compute[189265]: 2025-09-30 07:38:40.548 2 DEBUG oslo_concurrency.lockutils [None req-93ed0f6a-663f-426a-a12c-ff1f70075358 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "d5cc22f2-bd83-4bac-9ebe-9055fb0761c5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.087s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:38:41 compute-0 nova_compute[189265]: 2025-09-30 07:38:41.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:43 compute-0 podman[222218]: 2025-09-30 07:38:43.509526931 +0000 UTC m=+0.086005648 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:38:44 compute-0 nova_compute[189265]: 2025-09-30 07:38:44.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:46 compute-0 nova_compute[189265]: 2025-09-30 07:38:46.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:48 compute-0 nova_compute[189265]: 2025-09-30 07:38:48.309 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:38:48 compute-0 nova_compute[189265]: 2025-09-30 07:38:48.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:38:49 compute-0 nova_compute[189265]: 2025-09-30 07:38:49.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:49 compute-0 nova_compute[189265]: 2025-09-30 07:38:49.799 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:38:49 compute-0 nova_compute[189265]: 2025-09-30 07:38:49.805 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:38:51 compute-0 nova_compute[189265]: 2025-09-30 07:38:51.796 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:38:51 compute-0 nova_compute[189265]: 2025-09-30 07:38:51.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:52 compute-0 podman[222243]: 2025-09-30 07:38:52.49630998 +0000 UTC m=+0.078027118 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:38:54 compute-0 nova_compute[189265]: 2025-09-30 07:38:54.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:55 compute-0 nova_compute[189265]: 2025-09-30 07:38:55.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:38:56 compute-0 podman[222263]: 2025-09-30 07:38:56.488609993 +0000 UTC m=+0.073015463 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, vcs-type=git, distribution-scope=public, io.buildah.version=1.33.7, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., release=1755695350, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 07:38:56 compute-0 nova_compute[189265]: 2025-09-30 07:38:56.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:58 compute-0 nova_compute[189265]: 2025-09-30 07:38:58.301 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:38:58 compute-0 nova_compute[189265]: 2025-09-30 07:38:58.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:38:59 compute-0 nova_compute[189265]: 2025-09-30 07:38:59.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:38:59 compute-0 nova_compute[189265]: 2025-09-30 07:38:59.304 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:38:59 compute-0 nova_compute[189265]: 2025-09-30 07:38:59.305 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:38:59 compute-0 nova_compute[189265]: 2025-09-30 07:38:59.305 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:38:59 compute-0 nova_compute[189265]: 2025-09-30 07:38:59.305 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:38:59 compute-0 podman[222286]: 2025-09-30 07:38:59.426301505 +0000 UTC m=+0.075130264 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:38:59 compute-0 podman[222287]: 2025-09-30 07:38:59.440286058 +0000 UTC m=+0.093126763 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 07:38:59 compute-0 podman[222288]: 2025-09-30 07:38:59.467558503 +0000 UTC m=+0.118045150 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 07:38:59 compute-0 nova_compute[189265]: 2025-09-30 07:38:59.480 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:38:59 compute-0 nova_compute[189265]: 2025-09-30 07:38:59.482 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:38:59 compute-0 nova_compute[189265]: 2025-09-30 07:38:59.506 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.024s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:38:59 compute-0 nova_compute[189265]: 2025-09-30 07:38:59.506 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5849MB free_disk=73.30373764038086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:38:59 compute-0 nova_compute[189265]: 2025-09-30 07:38:59.507 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:38:59 compute-0 nova_compute[189265]: 2025-09-30 07:38:59.507 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:38:59 compute-0 podman[199733]: time="2025-09-30T07:38:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:38:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:38:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:38:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:38:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Sep 30 07:39:00 compute-0 nova_compute[189265]: 2025-09-30 07:39:00.560 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:39:00 compute-0 nova_compute[189265]: 2025-09-30 07:39:00.561 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:38:59 up  1:36,  0 user,  load average: 0.36, 0.32, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:39:00 compute-0 nova_compute[189265]: 2025-09-30 07:39:00.587 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:39:01 compute-0 nova_compute[189265]: 2025-09-30 07:39:01.096 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:39:01 compute-0 openstack_network_exporter[201859]: ERROR   07:39:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:39:01 compute-0 openstack_network_exporter[201859]: ERROR   07:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:39:01 compute-0 openstack_network_exporter[201859]: ERROR   07:39:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:39:01 compute-0 openstack_network_exporter[201859]: ERROR   07:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:39:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:39:01 compute-0 openstack_network_exporter[201859]: ERROR   07:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:39:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:39:01 compute-0 nova_compute[189265]: 2025-09-30 07:39:01.630 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:39:01 compute-0 nova_compute[189265]: 2025-09-30 07:39:01.630 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.123s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:39:01 compute-0 nova_compute[189265]: 2025-09-30 07:39:01.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:02 compute-0 nova_compute[189265]: 2025-09-30 07:39:02.630 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:39:04 compute-0 nova_compute[189265]: 2025-09-30 07:39:04.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:06 compute-0 nova_compute[189265]: 2025-09-30 07:39:06.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:39:06 compute-0 nova_compute[189265]: 2025-09-30 07:39:06.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:39:06 compute-0 nova_compute[189265]: 2025-09-30 07:39:06.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 07:39:06 compute-0 nova_compute[189265]: 2025-09-30 07:39:06.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:09 compute-0 nova_compute[189265]: 2025-09-30 07:39:09.292 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:11 compute-0 nova_compute[189265]: 2025-09-30 07:39:11.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:14 compute-0 nova_compute[189265]: 2025-09-30 07:39:14.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:14 compute-0 podman[222348]: 2025-09-30 07:39:14.5158976 +0000 UTC m=+0.081995262 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:39:16 compute-0 nova_compute[189265]: 2025-09-30 07:39:16.913 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:18 compute-0 nova_compute[189265]: 2025-09-30 07:39:18.302 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:39:18 compute-0 nova_compute[189265]: 2025-09-30 07:39:18.302 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 07:39:18 compute-0 nova_compute[189265]: 2025-09-30 07:39:18.813 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 07:39:19 compute-0 nova_compute[189265]: 2025-09-30 07:39:19.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:39:20.578 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:39:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:39:20.579 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:39:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:39:20.579 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:39:22 compute-0 nova_compute[189265]: 2025-09-30 07:39:22.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:23 compute-0 podman[222373]: 2025-09-30 07:39:23.476147984 +0000 UTC m=+0.064559040 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:39:24 compute-0 nova_compute[189265]: 2025-09-30 07:39:24.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:27 compute-0 nova_compute[189265]: 2025-09-30 07:39:27.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:27 compute-0 podman[222393]: 2025-09-30 07:39:27.497174925 +0000 UTC m=+0.072125178 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, vendor=Red Hat, Inc., name=ubi9-minimal, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 07:39:29 compute-0 nova_compute[189265]: 2025-09-30 07:39:29.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:29 compute-0 podman[199733]: time="2025-09-30T07:39:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:39:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:39:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:39:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:39:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Sep 30 07:39:30 compute-0 podman[222415]: 2025-09-30 07:39:30.518289269 +0000 UTC m=+0.080031206 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:39:30 compute-0 podman[222414]: 2025-09-30 07:39:30.536407631 +0000 UTC m=+0.106731955 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, tcib_build_tag=watcher_latest)
Sep 30 07:39:30 compute-0 podman[222416]: 2025-09-30 07:39:30.55131431 +0000 UTC m=+0.113211092 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:39:31 compute-0 openstack_network_exporter[201859]: ERROR   07:39:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:39:31 compute-0 openstack_network_exporter[201859]: ERROR   07:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:39:31 compute-0 openstack_network_exporter[201859]: ERROR   07:39:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:39:31 compute-0 openstack_network_exporter[201859]: ERROR   07:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:39:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:39:31 compute-0 openstack_network_exporter[201859]: ERROR   07:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:39:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:39:32 compute-0 nova_compute[189265]: 2025-09-30 07:39:32.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:34 compute-0 nova_compute[189265]: 2025-09-30 07:39:34.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:37 compute-0 nova_compute[189265]: 2025-09-30 07:39:37.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:39 compute-0 nova_compute[189265]: 2025-09-30 07:39:39.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:42 compute-0 nova_compute[189265]: 2025-09-30 07:39:42.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:44 compute-0 nova_compute[189265]: 2025-09-30 07:39:44.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:45 compute-0 podman[222476]: 2025-09-30 07:39:45.459347555 +0000 UTC m=+0.047092237 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:39:47 compute-0 nova_compute[189265]: 2025-09-30 07:39:47.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:49 compute-0 nova_compute[189265]: 2025-09-30 07:39:49.296 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:39:49 compute-0 nova_compute[189265]: 2025-09-30 07:39:49.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:50 compute-0 nova_compute[189265]: 2025-09-30 07:39:50.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:39:51 compute-0 nova_compute[189265]: 2025-09-30 07:39:51.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:39:51 compute-0 nova_compute[189265]: 2025-09-30 07:39:51.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:39:51 compute-0 nova_compute[189265]: 2025-09-30 07:39:51.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:39:52 compute-0 nova_compute[189265]: 2025-09-30 07:39:52.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:53 compute-0 unix_chkpwd[222502]: password check failed for user (root)
Sep 30 07:39:53 compute-0 sshd-session[222500]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 07:39:54 compute-0 nova_compute[189265]: 2025-09-30 07:39:54.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:54 compute-0 podman[222503]: 2025-09-30 07:39:54.486963831 +0000 UTC m=+0.069626225 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 07:39:55 compute-0 sshd-session[222500]: Failed password for root from 193.46.255.20 port 42644 ssh2
Sep 30 07:39:56 compute-0 ovn_controller[91436]: 2025-09-30T07:39:56Z|00231|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Sep 30 07:39:57 compute-0 unix_chkpwd[222523]: password check failed for user (root)
Sep 30 07:39:57 compute-0 nova_compute[189265]: 2025-09-30 07:39:57.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:57 compute-0 nova_compute[189265]: 2025-09-30 07:39:57.504 2 DEBUG nova.virt.libvirt.driver [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Creating tmpfile /var/lib/nova/instances/tmpnrr_wgku to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 07:39:57 compute-0 nova_compute[189265]: 2025-09-30 07:39:57.505 2 WARNING neutronclient.v2_0.client [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:39:57 compute-0 nova_compute[189265]: 2025-09-30 07:39:57.507 2 DEBUG nova.virt.libvirt.driver [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Creating tmpfile /var/lib/nova/instances/tmpgtz_zmb3 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 07:39:57 compute-0 nova_compute[189265]: 2025-09-30 07:39:57.507 2 WARNING neutronclient.v2_0.client [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:39:57 compute-0 nova_compute[189265]: 2025-09-30 07:39:57.511 2 DEBUG nova.compute.manager [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnrr_wgku',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 07:39:57 compute-0 nova_compute[189265]: 2025-09-30 07:39:57.515 2 DEBUG nova.compute.manager [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgtz_zmb3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 07:39:58 compute-0 podman[222525]: 2025-09-30 07:39:58.507533909 +0000 UTC m=+0.082645051 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350)
Sep 30 07:39:58 compute-0 sshd-session[222500]: Failed password for root from 193.46.255.20 port 42644 ssh2
Sep 30 07:39:59 compute-0 nova_compute[189265]: 2025-09-30 07:39:59.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:39:59 compute-0 unix_chkpwd[222546]: password check failed for user (root)
Sep 30 07:39:59 compute-0 nova_compute[189265]: 2025-09-30 07:39:59.575 2 WARNING neutronclient.v2_0.client [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:39:59 compute-0 nova_compute[189265]: 2025-09-30 07:39:59.578 2 WARNING neutronclient.v2_0.client [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:39:59 compute-0 podman[199733]: time="2025-09-30T07:39:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:39:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:39:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:39:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:39:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Sep 30 07:39:59 compute-0 nova_compute[189265]: 2025-09-30 07:39:59.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:39:59 compute-0 nova_compute[189265]: 2025-09-30 07:39:59.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:40:00 compute-0 nova_compute[189265]: 2025-09-30 07:40:00.307 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:40:00 compute-0 nova_compute[189265]: 2025-09-30 07:40:00.308 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:40:00 compute-0 nova_compute[189265]: 2025-09-30 07:40:00.308 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:40:00 compute-0 nova_compute[189265]: 2025-09-30 07:40:00.308 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:40:00 compute-0 nova_compute[189265]: 2025-09-30 07:40:00.513 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:40:00 compute-0 nova_compute[189265]: 2025-09-30 07:40:00.514 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:40:00 compute-0 nova_compute[189265]: 2025-09-30 07:40:00.552 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:40:00 compute-0 nova_compute[189265]: 2025-09-30 07:40:00.553 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5861MB free_disk=73.30373764038086GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:40:00 compute-0 nova_compute[189265]: 2025-09-30 07:40:00.554 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:40:00 compute-0 nova_compute[189265]: 2025-09-30 07:40:00.554 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:40:01 compute-0 anacron[168842]: Job `cron.weekly' started
Sep 30 07:40:01 compute-0 anacron[168842]: Job `cron.weekly' terminated
Sep 30 07:40:01 compute-0 openstack_network_exporter[201859]: ERROR   07:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:40:01 compute-0 openstack_network_exporter[201859]: ERROR   07:40:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:40:01 compute-0 openstack_network_exporter[201859]: ERROR   07:40:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:40:01 compute-0 openstack_network_exporter[201859]: ERROR   07:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:40:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:40:01 compute-0 openstack_network_exporter[201859]: ERROR   07:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:40:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:40:01 compute-0 podman[222551]: 2025-09-30 07:40:01.499465424 +0000 UTC m=+0.077035890 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent)
Sep 30 07:40:01 compute-0 podman[222550]: 2025-09-30 07:40:01.511164561 +0000 UTC m=+0.084215377 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 07:40:01 compute-0 podman[222552]: 2025-09-30 07:40:01.561335456 +0000 UTC m=+0.132745504 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Sep 30 07:40:01 compute-0 sshd-session[222500]: Failed password for root from 193.46.255.20 port 42644 ssh2
Sep 30 07:40:02 compute-0 nova_compute[189265]: 2025-09-30 07:40:02.168 2 WARNING nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance dd9afa46-ab32-4a8e-861d-d825051c267a has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 07:40:02 compute-0 nova_compute[189265]: 2025-09-30 07:40:02.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:02 compute-0 nova_compute[189265]: 2025-09-30 07:40:02.676 2 WARNING nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance c5599a24-d8b6-491b-a582-14ff5b98bd5d has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.
Sep 30 07:40:02 compute-0 nova_compute[189265]: 2025-09-30 07:40:02.676 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:40:02 compute-0 nova_compute[189265]: 2025-09-30 07:40:02.677 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:40:00 up  1:37,  0 user,  load average: 0.74, 0.39, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:40:02 compute-0 rsyslogd[1007]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Sep 30 07:40:02 compute-0 nova_compute[189265]: 2025-09-30 07:40:02.728 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing inventories for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 07:40:02 compute-0 nova_compute[189265]: 2025-09-30 07:40:02.745 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating ProviderTree inventory for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 07:40:02 compute-0 nova_compute[189265]: 2025-09-30 07:40:02.746 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating inventory in ProviderTree for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 07:40:02 compute-0 nova_compute[189265]: 2025-09-30 07:40:02.801 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing aggregate associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 07:40:02 compute-0 nova_compute[189265]: 2025-09-30 07:40:02.833 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing trait associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, traits: COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_FMA3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_AC97,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_ABM,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,HW_CPU_X86_CLMUL _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 07:40:02 compute-0 nova_compute[189265]: 2025-09-30 07:40:02.896 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:40:03 compute-0 nova_compute[189265]: 2025-09-30 07:40:03.404 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:40:03 compute-0 sshd-session[222500]: Received disconnect from 193.46.255.20 port 42644:11:  [preauth]
Sep 30 07:40:03 compute-0 sshd-session[222500]: Disconnected from authenticating user root 193.46.255.20 port 42644 [preauth]
Sep 30 07:40:03 compute-0 sshd-session[222500]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 07:40:03 compute-0 nova_compute[189265]: 2025-09-30 07:40:03.917 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:40:03 compute-0 nova_compute[189265]: 2025-09-30 07:40:03.918 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.364s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:40:04 compute-0 nova_compute[189265]: 2025-09-30 07:40:04.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:04 compute-0 unix_chkpwd[222613]: password check failed for user (root)
Sep 30 07:40:04 compute-0 sshd-session[222611]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 07:40:04 compute-0 nova_compute[189265]: 2025-09-30 07:40:04.760 2 DEBUG nova.compute.manager [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgtz_zmb3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c5599a24-d8b6-491b-a582-14ff5b98bd5d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 07:40:04 compute-0 nova_compute[189265]: 2025-09-30 07:40:04.918 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:40:05 compute-0 nova_compute[189265]: 2025-09-30 07:40:05.429 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:40:05 compute-0 nova_compute[189265]: 2025-09-30 07:40:05.774 2 DEBUG oslo_concurrency.lockutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-c5599a24-d8b6-491b-a582-14ff5b98bd5d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:40:05 compute-0 nova_compute[189265]: 2025-09-30 07:40:05.774 2 DEBUG oslo_concurrency.lockutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-c5599a24-d8b6-491b-a582-14ff5b98bd5d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:40:05 compute-0 nova_compute[189265]: 2025-09-30 07:40:05.775 2 DEBUG nova.network.neutron [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:40:06 compute-0 nova_compute[189265]: 2025-09-30 07:40:06.284 2 WARNING neutronclient.v2_0.client [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:06 compute-0 sshd-session[222611]: Failed password for root from 193.46.255.20 port 17062 ssh2
Sep 30 07:40:06 compute-0 unix_chkpwd[222614]: password check failed for user (root)
Sep 30 07:40:06 compute-0 nova_compute[189265]: 2025-09-30 07:40:06.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:40:07 compute-0 nova_compute[189265]: 2025-09-30 07:40:07.289 2 WARNING neutronclient.v2_0.client [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:07 compute-0 nova_compute[189265]: 2025-09-30 07:40:07.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:07 compute-0 nova_compute[189265]: 2025-09-30 07:40:07.572 2 DEBUG nova.network.neutron [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Updating instance_info_cache with network_info: [{"id": "25a2d902-e837-49df-b614-07f054d068db", "address": "fa:16:3e:43:0a:c9", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25a2d902-e8", "ovs_interfaceid": "25a2d902-e837-49df-b614-07f054d068db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.079 2 DEBUG oslo_concurrency.lockutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-c5599a24-d8b6-491b-a582-14ff5b98bd5d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.093 2 DEBUG nova.virt.libvirt.driver [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgtz_zmb3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c5599a24-d8b6-491b-a582-14ff5b98bd5d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.094 2 DEBUG nova.virt.libvirt.driver [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Creating instance directory: /var/lib/nova/instances/c5599a24-d8b6-491b-a582-14ff5b98bd5d pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.094 2 DEBUG nova.virt.libvirt.driver [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Creating disk.info with the contents: {'/var/lib/nova/instances/c5599a24-d8b6-491b-a582-14ff5b98bd5d/disk': 'qcow2', '/var/lib/nova/instances/c5599a24-d8b6-491b-a582-14ff5b98bd5d/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.095 2 DEBUG nova.virt.libvirt.driver [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.096 2 DEBUG nova.objects.instance [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid c5599a24-d8b6-491b-a582-14ff5b98bd5d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.603 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.606 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.607 2 DEBUG oslo_concurrency.processutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:40:08 compute-0 sshd-session[222611]: Failed password for root from 193.46.255.20 port 17062 ssh2
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.693 2 DEBUG oslo_concurrency.processutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.693 2 DEBUG oslo_concurrency.lockutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.694 2 DEBUG oslo_concurrency.lockutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.694 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.697 2 DEBUG oslo_utils.imageutils.format_inspector [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.697 2 DEBUG oslo_concurrency.processutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.743 2 DEBUG oslo_concurrency.processutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.045s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.744 2 DEBUG oslo_concurrency.processutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/c5599a24-d8b6-491b-a582-14ff5b98bd5d/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.777 2 DEBUG oslo_concurrency.processutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/c5599a24-d8b6-491b-a582-14ff5b98bd5d/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.777 2 DEBUG oslo_concurrency.lockutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.083s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.778 2 DEBUG oslo_concurrency.processutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.839 2 DEBUG oslo_concurrency.processutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.840 2 DEBUG nova.virt.disk.api [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Checking if we can resize image /var/lib/nova/instances/c5599a24-d8b6-491b-a582-14ff5b98bd5d/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.841 2 DEBUG oslo_concurrency.processutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5599a24-d8b6-491b-a582-14ff5b98bd5d/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.920 2 DEBUG oslo_concurrency.processutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c5599a24-d8b6-491b-a582-14ff5b98bd5d/disk --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.921 2 DEBUG nova.virt.disk.api [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Cannot resize image /var/lib/nova/instances/c5599a24-d8b6-491b-a582-14ff5b98bd5d/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:40:08 compute-0 nova_compute[189265]: 2025-09-30 07:40:08.922 2 DEBUG nova.objects.instance [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'migration_context' on Instance uuid c5599a24-d8b6-491b-a582-14ff5b98bd5d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.435 2 DEBUG nova.objects.base [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<c5599a24-d8b6-491b-a582-14ff5b98bd5d> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.435 2 DEBUG oslo_concurrency.processutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c5599a24-d8b6-491b-a582-14ff5b98bd5d/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.460 2 DEBUG oslo_concurrency.processutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/c5599a24-d8b6-491b-a582-14ff5b98bd5d/disk.config 497664" returned: 0 in 0.025s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.464 2 DEBUG nova.virt.libvirt.driver [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.466 2 DEBUG nova.virt.libvirt.vif [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T07:39:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-848120778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-848120778',id=25,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:39:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-r5fp6y5j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:39:28Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=c5599a24-d8b6-491b-a582-14ff5b98bd5d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25a2d902-e837-49df-b614-07f054d068db", "address": "fa:16:3e:43:0a:c9", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap25a2d902-e8", "ovs_interfaceid": "25a2d902-e837-49df-b614-07f054d068db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.467 2 DEBUG nova.network.os_vif_util [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "25a2d902-e837-49df-b614-07f054d068db", "address": "fa:16:3e:43:0a:c9", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap25a2d902-e8", "ovs_interfaceid": "25a2d902-e837-49df-b614-07f054d068db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.468 2 DEBUG nova.network.os_vif_util [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:0a:c9,bridge_name='br-int',has_traffic_filtering=True,id=25a2d902-e837-49df-b614-07f054d068db,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25a2d902-e8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.469 2 DEBUG os_vif [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:0a:c9,bridge_name='br-int',has_traffic_filtering=True,id=25a2d902-e837-49df-b614-07f054d068db,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25a2d902-e8') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.471 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.472 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.474 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '653a4b93-e266-56a8-9f7f-341c8a01a89e', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.526 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap25a2d902-e8, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.526 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap25a2d902-e8, col_values=(('qos', UUID('fcae76ac-e549-43b3-8778-f33176e586e5')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.527 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap25a2d902-e8, col_values=(('external_ids', {'iface-id': '25a2d902-e837-49df-b614-07f054d068db', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:43:0a:c9', 'vm-uuid': 'c5599a24-d8b6-491b-a582-14ff5b98bd5d'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:09 compute-0 NetworkManager[51813]: <info>  [1759218009.5286] manager: (tap25a2d902-e8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.538 2 INFO os_vif [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:0a:c9,bridge_name='br-int',has_traffic_filtering=True,id=25a2d902-e837-49df-b614-07f054d068db,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25a2d902-e8')
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.538 2 DEBUG nova.virt.libvirt.driver [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.539 2 DEBUG nova.compute.manager [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgtz_zmb3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c5599a24-d8b6-491b-a582-14ff5b98bd5d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.540 2 WARNING neutronclient.v2_0.client [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:09 compute-0 nova_compute[189265]: 2025-09-30 07:40:09.958 2 WARNING neutronclient.v2_0.client [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:10 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:10.234 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:40:10 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:10.235 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:40:10 compute-0 nova_compute[189265]: 2025-09-30 07:40:10.235 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:10 compute-0 sshd-session[222636]: Connection closed by 113.90.141.147 port 36552
Sep 30 07:40:10 compute-0 unix_chkpwd[222637]: password check failed for user (root)
Sep 30 07:40:11 compute-0 nova_compute[189265]: 2025-09-30 07:40:11.008 2 DEBUG nova.network.neutron [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Port 25a2d902-e837-49df-b614-07f054d068db updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 07:40:11 compute-0 nova_compute[189265]: 2025-09-30 07:40:11.027 2 DEBUG nova.compute.manager [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpgtz_zmb3',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='c5599a24-d8b6-491b-a582-14ff5b98bd5d',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 07:40:13 compute-0 sshd-session[222611]: Failed password for root from 193.46.255.20 port 17062 ssh2
Sep 30 07:40:14 compute-0 kernel: tap25a2d902-e8: entered promiscuous mode
Sep 30 07:40:14 compute-0 ovn_controller[91436]: 2025-09-30T07:40:14Z|00232|binding|INFO|Claiming lport 25a2d902-e837-49df-b614-07f054d068db for this additional chassis.
Sep 30 07:40:14 compute-0 ovn_controller[91436]: 2025-09-30T07:40:14Z|00233|binding|INFO|25a2d902-e837-49df-b614-07f054d068db: Claiming fa:16:3e:43:0a:c9 10.100.0.10
Sep 30 07:40:14 compute-0 NetworkManager[51813]: <info>  [1759218014.2644] manager: (tap25a2d902-e8): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Sep 30 07:40:14 compute-0 nova_compute[189265]: 2025-09-30 07:40:14.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.272 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:0a:c9 10.100.0.10'], port_security=['fa:16:3e:43:0a:c9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c5599a24-d8b6-491b-a582-14ff5b98bd5d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '10', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=25a2d902-e837-49df-b614-07f054d068db) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.274 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 25a2d902-e837-49df-b614-07f054d068db in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.276 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:40:14 compute-0 nova_compute[189265]: 2025-09-30 07:40:14.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:14 compute-0 ovn_controller[91436]: 2025-09-30T07:40:14Z|00234|binding|INFO|Setting lport 25a2d902-e837-49df-b614-07f054d068db ovn-installed in OVS
Sep 30 07:40:14 compute-0 nova_compute[189265]: 2025-09-30 07:40:14.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.296 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[15a24509-c96e-4b8c-99ee-0640d5e58c7f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.297 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc99c822b-31 in ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.300 210650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc99c822b-30 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.300 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[8ada0567-a316-436b-82b1-cdffbedf5f09]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.301 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[aafd79c2-cb7e-409b-8654-7041653c5cb6]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.319 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[7af39a3d-2147-4598-8ebe-3a73519597b2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 nova_compute[189265]: 2025-09-30 07:40:14.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:14 compute-0 systemd-machined[149233]: New machine qemu-19-instance-00000019.
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.340 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[8ceeee23-1487-43d0-8b9b-84862e34086a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 systemd[1]: Started Virtual Machine qemu-19-instance-00000019.
Sep 30 07:40:14 compute-0 systemd-udevd[222656]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:40:14 compute-0 NetworkManager[51813]: <info>  [1759218014.3654] device (tap25a2d902-e8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:40:14 compute-0 NetworkManager[51813]: <info>  [1759218014.3670] device (tap25a2d902-e8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.385 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[399a51ec-869f-434b-bd15-c76cd22d1d6d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.393 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[e9186a63-0c37-4470-80dc-049b96506d6d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 NetworkManager[51813]: <info>  [1759218014.3957] manager: (tapc99c822b-30): new Veth device (/org/freedesktop/NetworkManager/Devices/83)
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.432 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[04242f32-b96e-478b-a0d5-e37805cdf1e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.435 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[ab0cd74e-c43c-4e25-8090-ecd31c9f70ca]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 NetworkManager[51813]: <info>  [1759218014.4634] device (tapc99c822b-30): carrier: link connected
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.473 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[4c9ad602-0051-4ac0-9717-feb9e48e1b61]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.494 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[433e9f1c-ffd0-411e-8305-78be5103ce33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 587218, 'reachable_time': 24680, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222686, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.509 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7e47abc3-670d-4f44-9b6f-3abe6ee4c6ef]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:678c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 587218, 'tstamp': 587218}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222687, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.527 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[a5099905-a4cd-43a6-a1b4-716cbeb69df5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 587218, 'reachable_time': 24680, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222688, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 nova_compute[189265]: 2025-09-30 07:40:14.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.566 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b4aea219-1197-4363-ae15-51bce6040194]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.645 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c3c989a9-52d1-4955-b7de-16e42c095133]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.647 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.647 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.648 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc99c822b-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:14 compute-0 NetworkManager[51813]: <info>  [1759218014.6924] manager: (tapc99c822b-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Sep 30 07:40:14 compute-0 kernel: tapc99c822b-30: entered promiscuous mode
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.696 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc99c822b-30, col_values=(('external_ids', {'iface-id': '67b7df48-3f38-444a-8506-1c0ec5bd1d15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:14 compute-0 nova_compute[189265]: 2025-09-30 07:40:14.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:14 compute-0 ovn_controller[91436]: 2025-09-30T07:40:14Z|00235|binding|INFO|Releasing lport 67b7df48-3f38-444a-8506-1c0ec5bd1d15 from this chassis (sb_readonly=0)
Sep 30 07:40:14 compute-0 nova_compute[189265]: 2025-09-30 07:40:14.713 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:14 compute-0 nova_compute[189265]: 2025-09-30 07:40:14.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.717 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7a59a86c-016c-44fb-881f-cf079ab5a91e]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.717 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.718 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.718 100322 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for c99c822b-3191-49e5-b938-903e25b4a9bb disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.718 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.719 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d58c1697-5fba-4efb-8269-92543528cc26]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.719 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.720 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[09c42537-5041-402d-9b5f-830f163e0cdd]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.720 100322 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: global
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     log         /dev/log local0 debug
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     log-tag     haproxy-metadata-proxy-c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     user        root
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     group       root
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     maxconn     1024
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     pidfile     /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     daemon
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: defaults
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     log global
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     mode http
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     option httplog
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     option dontlognull
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     option http-server-close
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     option forwardfor
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     retries                 3
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     timeout http-request    30s
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     timeout connect         30s
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     timeout client          32s
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     timeout server          32s
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     timeout http-keep-alive 30s
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: listen listener
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     bind 169.254.169.254:80
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:     http-request add-header X-OVN-Network-ID c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 07:40:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:14.721 100322 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'env', 'PROCESS_TAG=haproxy-c99c822b-3191-49e5-b938-903e25b4a9bb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c99c822b-3191-49e5-b938-903e25b4a9bb.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 07:40:14 compute-0 sshd-session[222611]: Received disconnect from 193.46.255.20 port 17062:11:  [preauth]
Sep 30 07:40:14 compute-0 sshd-session[222611]: Disconnected from authenticating user root 193.46.255.20 port 17062 [preauth]
Sep 30 07:40:14 compute-0 sshd-session[222611]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 07:40:15 compute-0 podman[222728]: 2025-09-30 07:40:15.248211042 +0000 UTC m=+0.077312387 container create fe157cb3a43a8bc4896aa619491217ae4f99a666f669f318694e295ec1056d64 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930)
Sep 30 07:40:15 compute-0 systemd[1]: Started libpod-conmon-fe157cb3a43a8bc4896aa619491217ae4f99a666f669f318694e295ec1056d64.scope.
Sep 30 07:40:15 compute-0 podman[222728]: 2025-09-30 07:40:15.21377258 +0000 UTC m=+0.042873925 image pull eeebcc09bc72f81ab45f5ab87eb8f6a7b554b949227aeec082bdb0732754ddc8 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 07:40:15 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:40:15 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff75452424700f7ca13b022c7e82d33f65adadd381e6a670d9a8afe61eab2f74/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 07:40:15 compute-0 podman[222728]: 2025-09-30 07:40:15.351933929 +0000 UTC m=+0.181035284 container init fe157cb3a43a8bc4896aa619491217ae4f99a666f669f318694e295ec1056d64 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 07:40:15 compute-0 podman[222728]: 2025-09-30 07:40:15.358495948 +0000 UTC m=+0.187597263 container start fe157cb3a43a8bc4896aa619491217ae4f99a666f669f318694e295ec1056d64 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Sep 30 07:40:15 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[222743]: [NOTICE]   (222747) : New worker (222749) forked
Sep 30 07:40:15 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[222743]: [NOTICE]   (222747) : Loading success.
Sep 30 07:40:15 compute-0 unix_chkpwd[222771]: password check failed for user (root)
Sep 30 07:40:15 compute-0 sshd-session[222705]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 07:40:16 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:16.237 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:16 compute-0 podman[222772]: 2025-09-30 07:40:16.511205745 +0000 UTC m=+0.085874744 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:40:17 compute-0 ovn_controller[91436]: 2025-09-30T07:40:17Z|00236|binding|INFO|Claiming lport 25a2d902-e837-49df-b614-07f054d068db for this chassis.
Sep 30 07:40:17 compute-0 ovn_controller[91436]: 2025-09-30T07:40:17Z|00237|binding|INFO|25a2d902-e837-49df-b614-07f054d068db: Claiming fa:16:3e:43:0a:c9 10.100.0.10
Sep 30 07:40:17 compute-0 ovn_controller[91436]: 2025-09-30T07:40:17Z|00238|binding|INFO|Setting lport 25a2d902-e837-49df-b614-07f054d068db up in Southbound
Sep 30 07:40:17 compute-0 sshd-session[222705]: Failed password for root from 193.46.255.20 port 11134 ssh2
Sep 30 07:40:18 compute-0 nova_compute[189265]: 2025-09-30 07:40:18.516 2 INFO nova.compute.manager [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Post operation of migration started
Sep 30 07:40:18 compute-0 nova_compute[189265]: 2025-09-30 07:40:18.517 2 WARNING neutronclient.v2_0.client [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:18 compute-0 nova_compute[189265]: 2025-09-30 07:40:18.973 2 WARNING neutronclient.v2_0.client [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:18 compute-0 nova_compute[189265]: 2025-09-30 07:40:18.974 2 WARNING neutronclient.v2_0.client [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:19 compute-0 nova_compute[189265]: 2025-09-30 07:40:19.106 2 DEBUG oslo_concurrency.lockutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-c5599a24-d8b6-491b-a582-14ff5b98bd5d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:40:19 compute-0 nova_compute[189265]: 2025-09-30 07:40:19.107 2 DEBUG oslo_concurrency.lockutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-c5599a24-d8b6-491b-a582-14ff5b98bd5d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:40:19 compute-0 nova_compute[189265]: 2025-09-30 07:40:19.107 2 DEBUG nova.network.neutron [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:40:19 compute-0 nova_compute[189265]: 2025-09-30 07:40:19.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:19 compute-0 nova_compute[189265]: 2025-09-30 07:40:19.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:19 compute-0 nova_compute[189265]: 2025-09-30 07:40:19.628 2 WARNING neutronclient.v2_0.client [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:19 compute-0 unix_chkpwd[222796]: password check failed for user (root)
Sep 30 07:40:20 compute-0 nova_compute[189265]: 2025-09-30 07:40:20.320 2 WARNING neutronclient.v2_0.client [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:20 compute-0 nova_compute[189265]: 2025-09-30 07:40:20.475 2 DEBUG nova.network.neutron [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Updating instance_info_cache with network_info: [{"id": "25a2d902-e837-49df-b614-07f054d068db", "address": "fa:16:3e:43:0a:c9", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25a2d902-e8", "ovs_interfaceid": "25a2d902-e837-49df-b614-07f054d068db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:40:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:20.580 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:40:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:20.580 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:40:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:20.581 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:40:20 compute-0 nova_compute[189265]: 2025-09-30 07:40:20.984 2 DEBUG oslo_concurrency.lockutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-c5599a24-d8b6-491b-a582-14ff5b98bd5d" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:40:21 compute-0 nova_compute[189265]: 2025-09-30 07:40:21.514 2 DEBUG oslo_concurrency.lockutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:40:21 compute-0 nova_compute[189265]: 2025-09-30 07:40:21.515 2 DEBUG oslo_concurrency.lockutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:40:21 compute-0 nova_compute[189265]: 2025-09-30 07:40:21.515 2 DEBUG oslo_concurrency.lockutils [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:40:21 compute-0 nova_compute[189265]: 2025-09-30 07:40:21.521 2 INFO nova.virt.libvirt.driver [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 07:40:21 compute-0 virtqemud[189090]: Domain id=19 name='instance-00000019' uuid=c5599a24-d8b6-491b-a582-14ff5b98bd5d is tainted: custom-monitor
Sep 30 07:40:21 compute-0 sshd-session[222705]: Failed password for root from 193.46.255.20 port 11134 ssh2
Sep 30 07:40:21 compute-0 unix_chkpwd[222798]: password check failed for user (root)
Sep 30 07:40:22 compute-0 nova_compute[189265]: 2025-09-30 07:40:22.529 2 INFO nova.virt.libvirt.driver [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 07:40:23 compute-0 sshd-session[222705]: Failed password for root from 193.46.255.20 port 11134 ssh2
Sep 30 07:40:23 compute-0 nova_compute[189265]: 2025-09-30 07:40:23.535 2 INFO nova.virt.libvirt.driver [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 07:40:23 compute-0 nova_compute[189265]: 2025-09-30 07:40:23.540 2 DEBUG nova.compute.manager [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:40:24 compute-0 sshd-session[222705]: Received disconnect from 193.46.255.20 port 11134:11:  [preauth]
Sep 30 07:40:24 compute-0 sshd-session[222705]: Disconnected from authenticating user root 193.46.255.20 port 11134 [preauth]
Sep 30 07:40:24 compute-0 sshd-session[222705]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.20  user=root
Sep 30 07:40:24 compute-0 nova_compute[189265]: 2025-09-30 07:40:24.052 2 DEBUG nova.objects.instance [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 07:40:24 compute-0 nova_compute[189265]: 2025-09-30 07:40:24.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:24 compute-0 nova_compute[189265]: 2025-09-30 07:40:24.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:25 compute-0 nova_compute[189265]: 2025-09-30 07:40:25.071 2 WARNING neutronclient.v2_0.client [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:25 compute-0 nova_compute[189265]: 2025-09-30 07:40:25.274 2 WARNING neutronclient.v2_0.client [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:25 compute-0 nova_compute[189265]: 2025-09-30 07:40:25.275 2 WARNING neutronclient.v2_0.client [None req-ad481397-e09b-471a-aa83-5b07b58f4320 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:25 compute-0 podman[222799]: 2025-09-30 07:40:25.51966161 +0000 UTC m=+0.091222929 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.schema-version=1.0)
Sep 30 07:40:29 compute-0 nova_compute[189265]: 2025-09-30 07:40:29.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:29 compute-0 podman[222819]: 2025-09-30 07:40:29.515144195 +0000 UTC m=+0.088676115 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, release=1755695350)
Sep 30 07:40:29 compute-0 nova_compute[189265]: 2025-09-30 07:40:29.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:29 compute-0 podman[199733]: time="2025-09-30T07:40:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:40:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:40:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Sep 30 07:40:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:40:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3479 "" "Go-http-client/1.1"
Sep 30 07:40:31 compute-0 openstack_network_exporter[201859]: ERROR   07:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:40:31 compute-0 openstack_network_exporter[201859]: ERROR   07:40:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:40:31 compute-0 openstack_network_exporter[201859]: ERROR   07:40:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:40:31 compute-0 openstack_network_exporter[201859]: ERROR   07:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:40:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:40:31 compute-0 openstack_network_exporter[201859]: ERROR   07:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:40:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:40:32 compute-0 podman[222841]: 2025-09-30 07:40:32.479242107 +0000 UTC m=+0.057011633 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Sep 30 07:40:32 compute-0 podman[222840]: 2025-09-30 07:40:32.479312959 +0000 UTC m=+0.066424674 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true)
Sep 30 07:40:32 compute-0 podman[222842]: 2025-09-30 07:40:32.511338851 +0000 UTC m=+0.095967725 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:40:33 compute-0 nova_compute[189265]: 2025-09-30 07:40:33.278 2 DEBUG nova.compute.manager [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnrr_wgku',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dd9afa46-ab32-4a8e-861d-d825051c267a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 07:40:34 compute-0 nova_compute[189265]: 2025-09-30 07:40:34.295 2 DEBUG oslo_concurrency.lockutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-dd9afa46-ab32-4a8e-861d-d825051c267a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:40:34 compute-0 nova_compute[189265]: 2025-09-30 07:40:34.295 2 DEBUG oslo_concurrency.lockutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-dd9afa46-ab32-4a8e-861d-d825051c267a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:40:34 compute-0 nova_compute[189265]: 2025-09-30 07:40:34.296 2 DEBUG nova.network.neutron [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:40:34 compute-0 nova_compute[189265]: 2025-09-30 07:40:34.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:34 compute-0 nova_compute[189265]: 2025-09-30 07:40:34.537 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:34 compute-0 nova_compute[189265]: 2025-09-30 07:40:34.802 2 WARNING neutronclient.v2_0.client [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:35 compute-0 nova_compute[189265]: 2025-09-30 07:40:35.946 2 WARNING neutronclient.v2_0.client [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:36 compute-0 nova_compute[189265]: 2025-09-30 07:40:36.137 2 DEBUG nova.network.neutron [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Updating instance_info_cache with network_info: [{"id": "ac555adf-df44-4489-99a8-e2d32990877f", "address": "fa:16:3e:00:6b:06", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac555adf-df", "ovs_interfaceid": "ac555adf-df44-4489-99a8-e2d32990877f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:40:36 compute-0 nova_compute[189265]: 2025-09-30 07:40:36.644 2 DEBUG oslo_concurrency.lockutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-dd9afa46-ab32-4a8e-861d-d825051c267a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:40:36 compute-0 nova_compute[189265]: 2025-09-30 07:40:36.662 2 DEBUG nova.virt.libvirt.driver [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnrr_wgku',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dd9afa46-ab32-4a8e-861d-d825051c267a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 07:40:36 compute-0 nova_compute[189265]: 2025-09-30 07:40:36.663 2 DEBUG nova.virt.libvirt.driver [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Creating instance directory: /var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 07:40:36 compute-0 nova_compute[189265]: 2025-09-30 07:40:36.664 2 DEBUG nova.virt.libvirt.driver [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Creating disk.info with the contents: {'/var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a/disk': 'qcow2', '/var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 07:40:36 compute-0 nova_compute[189265]: 2025-09-30 07:40:36.665 2 DEBUG nova.virt.libvirt.driver [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 07:40:36 compute-0 nova_compute[189265]: 2025-09-30 07:40:36.665 2 DEBUG nova.objects.instance [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid dd9afa46-ab32-4a8e-861d-d825051c267a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.171 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.175 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.176 2 DEBUG oslo_concurrency.processutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.243 2 DEBUG oslo_concurrency.processutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.244 2 DEBUG oslo_concurrency.lockutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.244 2 DEBUG oslo_concurrency.lockutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.245 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.249 2 DEBUG oslo_utils.imageutils.format_inspector [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.249 2 DEBUG oslo_concurrency.processutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.331 2 DEBUG oslo_concurrency.processutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.332 2 DEBUG oslo_concurrency.processutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.368 2 DEBUG oslo_concurrency.processutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.369 2 DEBUG oslo_concurrency.lockutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.370 2 DEBUG oslo_concurrency.processutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.431 2 DEBUG oslo_concurrency.processutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.433 2 DEBUG nova.virt.disk.api [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Checking if we can resize image /var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.433 2 DEBUG oslo_concurrency.processutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.482 2 DEBUG oslo_concurrency.processutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a/disk --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.483 2 DEBUG nova.virt.disk.api [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Cannot resize image /var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.483 2 DEBUG nova.objects.instance [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'migration_context' on Instance uuid dd9afa46-ab32-4a8e-861d-d825051c267a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.995 2 DEBUG nova.objects.base [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<dd9afa46-ab32-4a8e-861d-d825051c267a> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:40:37 compute-0 nova_compute[189265]: 2025-09-30 07:40:37.995 2 DEBUG oslo_concurrency.processutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.032 2 DEBUG oslo_concurrency.processutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a/disk.config 497664" returned: 0 in 0.037s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.033 2 DEBUG nova.virt.libvirt.driver [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.035 2 DEBUG nova.virt.libvirt.vif [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T07:38:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2082264783',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2082264783',id=24,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:39:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-6z70tk4s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:39:04Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=dd9afa46-ab32-4a8e-861d-d825051c267a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ac555adf-df44-4489-99a8-e2d32990877f", "address": "fa:16:3e:00:6b:06", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapac555adf-df", "ovs_interfaceid": "ac555adf-df44-4489-99a8-e2d32990877f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.036 2 DEBUG nova.network.os_vif_util [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "ac555adf-df44-4489-99a8-e2d32990877f", "address": "fa:16:3e:00:6b:06", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapac555adf-df", "ovs_interfaceid": "ac555adf-df44-4489-99a8-e2d32990877f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.037 2 DEBUG nova.network.os_vif_util [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:00:6b:06,bridge_name='br-int',has_traffic_filtering=True,id=ac555adf-df44-4489-99a8-e2d32990877f,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac555adf-df') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.038 2 DEBUG os_vif [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:6b:06,bridge_name='br-int',has_traffic_filtering=True,id=ac555adf-df44-4489-99a8-e2d32990877f,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac555adf-df') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.042 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.043 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.045 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '97e60f54-5834-54c4-82a1-2f48d9fb40a4', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.055 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapac555adf-df, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.055 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapac555adf-df, col_values=(('qos', UUID('22fdd8e2-4c65-46d0-b363-0363e8ae2123')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.056 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapac555adf-df, col_values=(('external_ids', {'iface-id': 'ac555adf-df44-4489-99a8-e2d32990877f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:00:6b:06', 'vm-uuid': 'dd9afa46-ab32-4a8e-861d-d825051c267a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:38 compute-0 NetworkManager[51813]: <info>  [1759218038.0597] manager: (tapac555adf-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/85)
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.070 2 INFO os_vif [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:00:6b:06,bridge_name='br-int',has_traffic_filtering=True,id=ac555adf-df44-4489-99a8-e2d32990877f,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac555adf-df')
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.070 2 DEBUG nova.virt.libvirt.driver [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.071 2 DEBUG nova.compute.manager [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnrr_wgku',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dd9afa46-ab32-4a8e-861d-d825051c267a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.072 2 WARNING neutronclient.v2_0.client [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:38 compute-0 nova_compute[189265]: 2025-09-30 07:40:38.167 2 WARNING neutronclient.v2_0.client [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:39 compute-0 nova_compute[189265]: 2025-09-30 07:40:39.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:41 compute-0 nova_compute[189265]: 2025-09-30 07:40:41.139 2 DEBUG nova.network.neutron [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Port ac555adf-df44-4489-99a8-e2d32990877f updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 07:40:41 compute-0 nova_compute[189265]: 2025-09-30 07:40:41.155 2 DEBUG nova.compute.manager [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpnrr_wgku',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='dd9afa46-ab32-4a8e-861d-d825051c267a',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 07:40:43 compute-0 nova_compute[189265]: 2025-09-30 07:40:43.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:44 compute-0 kernel: tapac555adf-df: entered promiscuous mode
Sep 30 07:40:44 compute-0 NetworkManager[51813]: <info>  [1759218044.0484] manager: (tapac555adf-df): new Tun device (/org/freedesktop/NetworkManager/Devices/86)
Sep 30 07:40:44 compute-0 ovn_controller[91436]: 2025-09-30T07:40:44Z|00239|binding|INFO|Claiming lport ac555adf-df44-4489-99a8-e2d32990877f for this additional chassis.
Sep 30 07:40:44 compute-0 nova_compute[189265]: 2025-09-30 07:40:44.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:44 compute-0 ovn_controller[91436]: 2025-09-30T07:40:44Z|00240|binding|INFO|ac555adf-df44-4489-99a8-e2d32990877f: Claiming fa:16:3e:00:6b:06 10.100.0.14
Sep 30 07:40:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:44.062 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:6b:06 10.100.0.14'], port_security=['fa:16:3e:00:6b:06 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'dd9afa46-ab32-4a8e-861d-d825051c267a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '10', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=ac555adf-df44-4489-99a8-e2d32990877f) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:40:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:44.064 100322 INFO neutron.agent.ovn.metadata.agent [-] Port ac555adf-df44-4489-99a8-e2d32990877f in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:40:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:44.067 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:40:44 compute-0 ovn_controller[91436]: 2025-09-30T07:40:44Z|00241|binding|INFO|Setting lport ac555adf-df44-4489-99a8-e2d32990877f ovn-installed in OVS
Sep 30 07:40:44 compute-0 nova_compute[189265]: 2025-09-30 07:40:44.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:44 compute-0 nova_compute[189265]: 2025-09-30 07:40:44.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:44 compute-0 nova_compute[189265]: 2025-09-30 07:40:44.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:44 compute-0 systemd-udevd[222936]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:40:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:44.093 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d919c589-3bd1-4290-b450-2933188831a0]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:44 compute-0 NetworkManager[51813]: <info>  [1759218044.1081] device (tapac555adf-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:40:44 compute-0 NetworkManager[51813]: <info>  [1759218044.1102] device (tapac555adf-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:40:44 compute-0 systemd-machined[149233]: New machine qemu-20-instance-00000018.
Sep 30 07:40:44 compute-0 systemd[1]: Started Virtual Machine qemu-20-instance-00000018.
Sep 30 07:40:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:44.137 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[05064827-0e4e-482b-a822-92d6605dd6cb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:44.141 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[54d42b05-ea50-46d9-a0ac-9f2b96c4dc0e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:44.186 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[df60fafd-3b5c-483c-a029-bbb5ab7a0253]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:44.214 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[2746b2af-6247-4286-a399-82cf25886bbb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 28, 'tx_packets': 5, 'rx_bytes': 1672, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 587218, 'reachable_time': 24680, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222951, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:44.235 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[e0e2382b-0e55-42e7-bf71-b20d09eab6e9]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 587231, 'tstamp': 587231}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222953, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 587236, 'tstamp': 587236}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222953, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:44.236 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:44 compute-0 nova_compute[189265]: 2025-09-30 07:40:44.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:44 compute-0 nova_compute[189265]: 2025-09-30 07:40:44.239 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:44.240 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc99c822b-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:44.240 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:40:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:44.240 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc99c822b-30, col_values=(('external_ids', {'iface-id': '67b7df48-3f38-444a-8506-1c0ec5bd1d15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:44.241 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:40:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:44.242 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[74c41d23-ccf1-4a19-a7e3-6a94e7b57d11]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-c99c822b-3191-49e5-b938-903e25b4a9bb\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID c99c822b-3191-49e5-b938-903e25b4a9bb\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:44 compute-0 nova_compute[189265]: 2025-09-30 07:40:44.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:47 compute-0 ovn_controller[91436]: 2025-09-30T07:40:47Z|00242|binding|INFO|Claiming lport ac555adf-df44-4489-99a8-e2d32990877f for this chassis.
Sep 30 07:40:47 compute-0 ovn_controller[91436]: 2025-09-30T07:40:47Z|00243|binding|INFO|ac555adf-df44-4489-99a8-e2d32990877f: Claiming fa:16:3e:00:6b:06 10.100.0.14
Sep 30 07:40:47 compute-0 ovn_controller[91436]: 2025-09-30T07:40:47Z|00244|binding|INFO|Setting lport ac555adf-df44-4489-99a8-e2d32990877f up in Southbound
Sep 30 07:40:47 compute-0 podman[222975]: 2025-09-30 07:40:47.49939397 +0000 UTC m=+0.071313475 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:40:48 compute-0 nova_compute[189265]: 2025-09-30 07:40:48.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:48 compute-0 nova_compute[189265]: 2025-09-30 07:40:48.414 2 INFO nova.compute.manager [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Post operation of migration started
Sep 30 07:40:48 compute-0 nova_compute[189265]: 2025-09-30 07:40:48.415 2 WARNING neutronclient.v2_0.client [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:48 compute-0 nova_compute[189265]: 2025-09-30 07:40:48.965 2 WARNING neutronclient.v2_0.client [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:48 compute-0 nova_compute[189265]: 2025-09-30 07:40:48.965 2 WARNING neutronclient.v2_0.client [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:49 compute-0 nova_compute[189265]: 2025-09-30 07:40:49.161 2 DEBUG oslo_concurrency.lockutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-dd9afa46-ab32-4a8e-861d-d825051c267a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:40:49 compute-0 nova_compute[189265]: 2025-09-30 07:40:49.162 2 DEBUG oslo_concurrency.lockutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-dd9afa46-ab32-4a8e-861d-d825051c267a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:40:49 compute-0 nova_compute[189265]: 2025-09-30 07:40:49.162 2 DEBUG nova.network.neutron [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:40:49 compute-0 nova_compute[189265]: 2025-09-30 07:40:49.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:49 compute-0 nova_compute[189265]: 2025-09-30 07:40:49.670 2 WARNING neutronclient.v2_0.client [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:49 compute-0 nova_compute[189265]: 2025-09-30 07:40:49.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:40:50 compute-0 nova_compute[189265]: 2025-09-30 07:40:50.368 2 WARNING neutronclient.v2_0.client [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:50 compute-0 nova_compute[189265]: 2025-09-30 07:40:50.559 2 DEBUG nova.network.neutron [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Updating instance_info_cache with network_info: [{"id": "ac555adf-df44-4489-99a8-e2d32990877f", "address": "fa:16:3e:00:6b:06", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac555adf-df", "ovs_interfaceid": "ac555adf-df44-4489-99a8-e2d32990877f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:40:50 compute-0 nova_compute[189265]: 2025-09-30 07:40:50.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:40:51 compute-0 nova_compute[189265]: 2025-09-30 07:40:51.068 2 DEBUG oslo_concurrency.lockutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-dd9afa46-ab32-4a8e-861d-d825051c267a" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:40:51 compute-0 nova_compute[189265]: 2025-09-30 07:40:51.590 2 DEBUG oslo_concurrency.lockutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:40:51 compute-0 nova_compute[189265]: 2025-09-30 07:40:51.592 2 DEBUG oslo_concurrency.lockutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:40:51 compute-0 nova_compute[189265]: 2025-09-30 07:40:51.592 2 DEBUG oslo_concurrency.lockutils [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:40:51 compute-0 nova_compute[189265]: 2025-09-30 07:40:51.597 2 INFO nova.virt.libvirt.driver [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 07:40:51 compute-0 virtqemud[189090]: Domain id=20 name='instance-00000018' uuid=dd9afa46-ab32-4a8e-861d-d825051c267a is tainted: custom-monitor
Sep 30 07:40:51 compute-0 nova_compute[189265]: 2025-09-30 07:40:51.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:40:52 compute-0 nova_compute[189265]: 2025-09-30 07:40:52.609 2 INFO nova.virt.libvirt.driver [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 07:40:53 compute-0 nova_compute[189265]: 2025-09-30 07:40:53.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:53 compute-0 nova_compute[189265]: 2025-09-30 07:40:53.614 2 INFO nova.virt.libvirt.driver [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 07:40:53 compute-0 nova_compute[189265]: 2025-09-30 07:40:53.620 2 DEBUG nova.compute.manager [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:40:53 compute-0 nova_compute[189265]: 2025-09-30 07:40:53.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:40:53 compute-0 nova_compute[189265]: 2025-09-30 07:40:53.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:40:54 compute-0 nova_compute[189265]: 2025-09-30 07:40:54.132 2 DEBUG nova.objects.instance [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 07:40:54 compute-0 nova_compute[189265]: 2025-09-30 07:40:54.344 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:55 compute-0 nova_compute[189265]: 2025-09-30 07:40:55.153 2 WARNING neutronclient.v2_0.client [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:55 compute-0 nova_compute[189265]: 2025-09-30 07:40:55.258 2 WARNING neutronclient.v2_0.client [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:55 compute-0 nova_compute[189265]: 2025-09-30 07:40:55.259 2 WARNING neutronclient.v2_0.client [None req-b21a295e-3a5b-45f8-a7b2-99a228c7302a e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:40:56 compute-0 podman[222999]: 2025-09-30 07:40:56.486054833 +0000 UTC m=+0.065355673 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 07:40:58 compute-0 nova_compute[189265]: 2025-09-30 07:40:58.042 2 DEBUG oslo_concurrency.lockutils [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "c5599a24-d8b6-491b-a582-14ff5b98bd5d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:40:58 compute-0 nova_compute[189265]: 2025-09-30 07:40:58.043 2 DEBUG oslo_concurrency.lockutils [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "c5599a24-d8b6-491b-a582-14ff5b98bd5d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:40:58 compute-0 nova_compute[189265]: 2025-09-30 07:40:58.043 2 DEBUG oslo_concurrency.lockutils [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "c5599a24-d8b6-491b-a582-14ff5b98bd5d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:40:58 compute-0 nova_compute[189265]: 2025-09-30 07:40:58.044 2 DEBUG oslo_concurrency.lockutils [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "c5599a24-d8b6-491b-a582-14ff5b98bd5d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:40:58 compute-0 nova_compute[189265]: 2025-09-30 07:40:58.044 2 DEBUG oslo_concurrency.lockutils [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "c5599a24-d8b6-491b-a582-14ff5b98bd5d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:40:58 compute-0 nova_compute[189265]: 2025-09-30 07:40:58.058 2 INFO nova.compute.manager [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Terminating instance
Sep 30 07:40:58 compute-0 nova_compute[189265]: 2025-09-30 07:40:58.067 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:58 compute-0 nova_compute[189265]: 2025-09-30 07:40:58.577 2 DEBUG nova.compute.manager [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:40:58 compute-0 kernel: tap25a2d902-e8 (unregistering): left promiscuous mode
Sep 30 07:40:58 compute-0 NetworkManager[51813]: <info>  [1759218058.6028] device (tap25a2d902-e8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:40:58 compute-0 ovn_controller[91436]: 2025-09-30T07:40:58Z|00245|binding|INFO|Releasing lport 25a2d902-e837-49df-b614-07f054d068db from this chassis (sb_readonly=0)
Sep 30 07:40:58 compute-0 nova_compute[189265]: 2025-09-30 07:40:58.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:58 compute-0 ovn_controller[91436]: 2025-09-30T07:40:58Z|00246|binding|INFO|Setting lport 25a2d902-e837-49df-b614-07f054d068db down in Southbound
Sep 30 07:40:58 compute-0 ovn_controller[91436]: 2025-09-30T07:40:58Z|00247|binding|INFO|Removing iface tap25a2d902-e8 ovn-installed in OVS
Sep 30 07:40:58 compute-0 nova_compute[189265]: 2025-09-30 07:40:58.624 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:58.633 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:0a:c9 10.100.0.10'], port_security=['fa:16:3e:43:0a:c9 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c5599a24-d8b6-491b-a582-14ff5b98bd5d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '14', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=25a2d902-e837-49df-b614-07f054d068db) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:40:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:58.635 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 25a2d902-e837-49df-b614-07f054d068db in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:40:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:58.637 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c99c822b-3191-49e5-b938-903e25b4a9bb
Sep 30 07:40:58 compute-0 nova_compute[189265]: 2025-09-30 07:40:58.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:58 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000019.scope: Deactivated successfully.
Sep 30 07:40:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:58.661 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[eeb76604-0a3b-4cee-8ec5-9be9419e9bd8]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:58 compute-0 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000019.scope: Consumed 3.581s CPU time.
Sep 30 07:40:58 compute-0 systemd-machined[149233]: Machine qemu-19-instance-00000019 terminated.
Sep 30 07:40:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:58.696 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[38aed6c0-4d75-47b5-82b5-0552812dac59]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:58.699 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[035a783e-9154-4972-8691-bd6ba1731f1f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:58.737 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[74bfb65b-1baa-41ab-b088-41378a2920a6]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:58.754 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[70bad740-590e-4ab9-8ead-f4fd50c632bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc99c822b-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:67:8c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 48, 'tx_packets': 7, 'rx_bytes': 2512, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 59], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 587218, 'reachable_time': 43163, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223030, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:58.778 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[97076e0f-2751-475d-9ebb-9123482c902d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 587231, 'tstamp': 587231}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223031, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc99c822b-31'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 587236, 'tstamp': 587236}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223031, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:58.780 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:58 compute-0 nova_compute[189265]: 2025-09-30 07:40:58.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:58 compute-0 nova_compute[189265]: 2025-09-30 07:40:58.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:58.842 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc99c822b-30, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:58.842 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:40:58 compute-0 nova_compute[189265]: 2025-09-30 07:40:58.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:58.843 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc99c822b-30, col_values=(('external_ids', {'iface-id': '67b7df48-3f38-444a-8506-1c0ec5bd1d15'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:58.843 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:40:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:40:58.845 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[dd7e2efa-3e63-49c3-8689-7b69e9f1af6f]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-c99c822b-3191-49e5-b938-903e25b4a9bb\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID c99c822b-3191-49e5-b938-903e25b4a9bb\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:40:58 compute-0 nova_compute[189265]: 2025-09-30 07:40:58.887 2 INFO nova.virt.libvirt.driver [-] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Instance destroyed successfully.
Sep 30 07:40:58 compute-0 nova_compute[189265]: 2025-09-30 07:40:58.888 2 DEBUG nova.objects.instance [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lazy-loading 'resources' on Instance uuid c5599a24-d8b6-491b-a582-14ff5b98bd5d obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.035 2 DEBUG nova.compute.manager [req-bd0bf5e0-4eb0-4977-8b65-f9b7e69a245c req-941afd07-0470-4ea0-8492-47a96d859d08 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Received event network-vif-unplugged-25a2d902-e837-49df-b614-07f054d068db external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.035 2 DEBUG oslo_concurrency.lockutils [req-bd0bf5e0-4eb0-4977-8b65-f9b7e69a245c req-941afd07-0470-4ea0-8492-47a96d859d08 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "c5599a24-d8b6-491b-a582-14ff5b98bd5d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.035 2 DEBUG oslo_concurrency.lockutils [req-bd0bf5e0-4eb0-4977-8b65-f9b7e69a245c req-941afd07-0470-4ea0-8492-47a96d859d08 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c5599a24-d8b6-491b-a582-14ff5b98bd5d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.036 2 DEBUG oslo_concurrency.lockutils [req-bd0bf5e0-4eb0-4977-8b65-f9b7e69a245c req-941afd07-0470-4ea0-8492-47a96d859d08 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c5599a24-d8b6-491b-a582-14ff5b98bd5d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.036 2 DEBUG nova.compute.manager [req-bd0bf5e0-4eb0-4977-8b65-f9b7e69a245c req-941afd07-0470-4ea0-8492-47a96d859d08 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] No waiting events found dispatching network-vif-unplugged-25a2d902-e837-49df-b614-07f054d068db pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.036 2 DEBUG nova.compute.manager [req-bd0bf5e0-4eb0-4977-8b65-f9b7e69a245c req-941afd07-0470-4ea0-8492-47a96d859d08 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Received event network-vif-unplugged-25a2d902-e837-49df-b614-07f054d068db for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.395 2 DEBUG nova.virt.libvirt.vif [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T07:39:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-848120778',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-848120778',id=25,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:39:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-r5fp6y5j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:40:24Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=c5599a24-d8b6-491b-a582-14ff5b98bd5d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "25a2d902-e837-49df-b614-07f054d068db", "address": "fa:16:3e:43:0a:c9", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25a2d902-e8", "ovs_interfaceid": "25a2d902-e837-49df-b614-07f054d068db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.395 2 DEBUG nova.network.os_vif_util [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converting VIF {"id": "25a2d902-e837-49df-b614-07f054d068db", "address": "fa:16:3e:43:0a:c9", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap25a2d902-e8", "ovs_interfaceid": "25a2d902-e837-49df-b614-07f054d068db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.396 2 DEBUG nova.network.os_vif_util [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:43:0a:c9,bridge_name='br-int',has_traffic_filtering=True,id=25a2d902-e837-49df-b614-07f054d068db,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25a2d902-e8') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.397 2 DEBUG os_vif [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:43:0a:c9,bridge_name='br-int',has_traffic_filtering=True,id=25a2d902-e837-49df-b614-07f054d068db,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25a2d902-e8') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.399 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap25a2d902-e8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.404 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=fcae76ac-e549-43b3-8778-f33176e586e5) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.408 2 INFO os_vif [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:43:0a:c9,bridge_name='br-int',has_traffic_filtering=True,id=25a2d902-e837-49df-b614-07f054d068db,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap25a2d902-e8')
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.408 2 INFO nova.virt.libvirt.driver [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Deleting instance files /var/lib/nova/instances/c5599a24-d8b6-491b-a582-14ff5b98bd5d_del
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.409 2 INFO nova.virt.libvirt.driver [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Deletion of /var/lib/nova/instances/c5599a24-d8b6-491b-a582-14ff5b98bd5d_del complete
Sep 30 07:40:59 compute-0 podman[199733]: time="2025-09-30T07:40:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:40:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:40:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20751 "" "Go-http-client/1.1"
Sep 30 07:40:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:40:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3480 "" "Go-http-client/1.1"
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.920 2 INFO nova.compute.manager [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Took 1.34 seconds to destroy the instance on the hypervisor.
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.922 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.922 2 DEBUG nova.compute.manager [-] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.922 2 DEBUG nova.network.neutron [-] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:40:59 compute-0 nova_compute[189265]: 2025-09-30 07:40:59.923 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:41:00 compute-0 nova_compute[189265]: 2025-09-30 07:41:00.356 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:41:00 compute-0 podman[223043]: 2025-09-30 07:41:00.512096568 +0000 UTC m=+0.083341791 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, architecture=x86_64, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 07:41:00 compute-0 nova_compute[189265]: 2025-09-30 07:41:00.706 2 DEBUG nova.compute.manager [req-9056efef-261e-4003-927a-f81ffc820b9a req-e6f2bf9d-3c6d-4780-a5b3-bfad16b5779a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Received event network-vif-deleted-25a2d902-e837-49df-b614-07f054d068db external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:41:00 compute-0 nova_compute[189265]: 2025-09-30 07:41:00.706 2 INFO nova.compute.manager [req-9056efef-261e-4003-927a-f81ffc820b9a req-e6f2bf9d-3c6d-4780-a5b3-bfad16b5779a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Neutron deleted interface 25a2d902-e837-49df-b614-07f054d068db; detaching it from the instance and deleting it from the info cache
Sep 30 07:41:00 compute-0 nova_compute[189265]: 2025-09-30 07:41:00.706 2 DEBUG nova.network.neutron [req-9056efef-261e-4003-927a-f81ffc820b9a req-e6f2bf9d-3c6d-4780-a5b3-bfad16b5779a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:41:01 compute-0 nova_compute[189265]: 2025-09-30 07:41:01.092 2 DEBUG nova.compute.manager [req-e1d31d0e-2495-44c5-b191-94dc6fa2f548 req-8d852aed-01af-4569-b247-6911503c5dd4 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Received event network-vif-unplugged-25a2d902-e837-49df-b614-07f054d068db external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:41:01 compute-0 nova_compute[189265]: 2025-09-30 07:41:01.092 2 DEBUG oslo_concurrency.lockutils [req-e1d31d0e-2495-44c5-b191-94dc6fa2f548 req-8d852aed-01af-4569-b247-6911503c5dd4 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "c5599a24-d8b6-491b-a582-14ff5b98bd5d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:41:01 compute-0 nova_compute[189265]: 2025-09-30 07:41:01.093 2 DEBUG oslo_concurrency.lockutils [req-e1d31d0e-2495-44c5-b191-94dc6fa2f548 req-8d852aed-01af-4569-b247-6911503c5dd4 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c5599a24-d8b6-491b-a582-14ff5b98bd5d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:41:01 compute-0 nova_compute[189265]: 2025-09-30 07:41:01.093 2 DEBUG oslo_concurrency.lockutils [req-e1d31d0e-2495-44c5-b191-94dc6fa2f548 req-8d852aed-01af-4569-b247-6911503c5dd4 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "c5599a24-d8b6-491b-a582-14ff5b98bd5d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:41:01 compute-0 nova_compute[189265]: 2025-09-30 07:41:01.094 2 DEBUG nova.compute.manager [req-e1d31d0e-2495-44c5-b191-94dc6fa2f548 req-8d852aed-01af-4569-b247-6911503c5dd4 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] No waiting events found dispatching network-vif-unplugged-25a2d902-e837-49df-b614-07f054d068db pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:41:01 compute-0 nova_compute[189265]: 2025-09-30 07:41:01.094 2 DEBUG nova.compute.manager [req-e1d31d0e-2495-44c5-b191-94dc6fa2f548 req-8d852aed-01af-4569-b247-6911503c5dd4 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Received event network-vif-unplugged-25a2d902-e837-49df-b614-07f054d068db for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:41:01 compute-0 nova_compute[189265]: 2025-09-30 07:41:01.154 2 DEBUG nova.network.neutron [-] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:41:01 compute-0 nova_compute[189265]: 2025-09-30 07:41:01.214 2 DEBUG nova.compute.manager [req-9056efef-261e-4003-927a-f81ffc820b9a req-e6f2bf9d-3c6d-4780-a5b3-bfad16b5779a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Detach interface failed, port_id=25a2d902-e837-49df-b614-07f054d068db, reason: Instance c5599a24-d8b6-491b-a582-14ff5b98bd5d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 07:41:01 compute-0 openstack_network_exporter[201859]: ERROR   07:41:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:41:01 compute-0 openstack_network_exporter[201859]: ERROR   07:41:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:41:01 compute-0 openstack_network_exporter[201859]: ERROR   07:41:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:41:01 compute-0 openstack_network_exporter[201859]: ERROR   07:41:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:41:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:41:01 compute-0 openstack_network_exporter[201859]: ERROR   07:41:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:41:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:41:01 compute-0 nova_compute[189265]: 2025-09-30 07:41:01.661 2 INFO nova.compute.manager [-] [instance: c5599a24-d8b6-491b-a582-14ff5b98bd5d] Took 1.74 seconds to deallocate network for instance.
Sep 30 07:41:01 compute-0 nova_compute[189265]: 2025-09-30 07:41:01.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:41:01 compute-0 nova_compute[189265]: 2025-09-30 07:41:01.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:41:02 compute-0 nova_compute[189265]: 2025-09-30 07:41:02.188 2 DEBUG oslo_concurrency.lockutils [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:41:02 compute-0 nova_compute[189265]: 2025-09-30 07:41:02.189 2 DEBUG oslo_concurrency.lockutils [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:41:02 compute-0 nova_compute[189265]: 2025-09-30 07:41:02.194 2 DEBUG oslo_concurrency.lockutils [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:41:02 compute-0 nova_compute[189265]: 2025-09-30 07:41:02.226 2 INFO nova.scheduler.client.report [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Deleted allocations for instance c5599a24-d8b6-491b-a582-14ff5b98bd5d
Sep 30 07:41:02 compute-0 nova_compute[189265]: 2025-09-30 07:41:02.301 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:41:02 compute-0 nova_compute[189265]: 2025-09-30 07:41:02.302 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:41:02 compute-0 nova_compute[189265]: 2025-09-30 07:41:02.302 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:41:02 compute-0 nova_compute[189265]: 2025-09-30 07:41:02.303 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.269 2 DEBUG oslo_concurrency.lockutils [None req-0582ba1f-b178-4222-b60b-f51a5380e2d6 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "c5599a24-d8b6-491b-a582-14ff5b98bd5d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.226s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.356 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.424 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.426 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:41:03 compute-0 podman[223067]: 2025-09-30 07:41:03.509429167 +0000 UTC m=+0.082785905 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930)
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.520 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:41:03 compute-0 podman[223066]: 2025-09-30 07:41:03.541158301 +0000 UTC m=+0.113000515 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd)
Sep 30 07:41:03 compute-0 podman[223068]: 2025-09-30 07:41:03.56578577 +0000 UTC m=+0.128314806 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.657 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.659 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.681 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.022s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.682 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5691MB free_disk=73.27482986450195GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.682 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.682 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.698 2 DEBUG oslo_concurrency.lockutils [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "dd9afa46-ab32-4a8e-861d-d825051c267a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.699 2 DEBUG oslo_concurrency.lockutils [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "dd9afa46-ab32-4a8e-861d-d825051c267a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.699 2 DEBUG oslo_concurrency.lockutils [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "dd9afa46-ab32-4a8e-861d-d825051c267a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.700 2 DEBUG oslo_concurrency.lockutils [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "dd9afa46-ab32-4a8e-861d-d825051c267a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.700 2 DEBUG oslo_concurrency.lockutils [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "dd9afa46-ab32-4a8e-861d-d825051c267a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:41:03 compute-0 nova_compute[189265]: 2025-09-30 07:41:03.713 2 INFO nova.compute.manager [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Terminating instance
Sep 30 07:41:04 compute-0 nova_compute[189265]: 2025-09-30 07:41:04.233 2 DEBUG nova.compute.manager [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:41:04 compute-0 kernel: tapac555adf-df (unregistering): left promiscuous mode
Sep 30 07:41:04 compute-0 NetworkManager[51813]: <info>  [1759218064.2608] device (tapac555adf-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:41:04 compute-0 nova_compute[189265]: 2025-09-30 07:41:04.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:04 compute-0 ovn_controller[91436]: 2025-09-30T07:41:04Z|00248|binding|INFO|Releasing lport ac555adf-df44-4489-99a8-e2d32990877f from this chassis (sb_readonly=0)
Sep 30 07:41:04 compute-0 ovn_controller[91436]: 2025-09-30T07:41:04Z|00249|binding|INFO|Setting lport ac555adf-df44-4489-99a8-e2d32990877f down in Southbound
Sep 30 07:41:04 compute-0 ovn_controller[91436]: 2025-09-30T07:41:04Z|00250|binding|INFO|Removing iface tapac555adf-df ovn-installed in OVS
Sep 30 07:41:04 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:04.279 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:00:6b:06 10.100.0.14'], port_security=['fa:16:3e:00:6b:06 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'dd9afa46-ab32-4a8e-861d-d825051c267a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c99c822b-3191-49e5-b938-903e25b4a9bb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6431607f3dce4c88bbf6d17ee6cd45b2', 'neutron:revision_number': '15', 'neutron:security_group_ids': '39e9818d-6ede-4a3d-b6e2-a5ad3a4c803a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0bbcb02d-e040-4e0e-9a60-6466c4420133, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=ac555adf-df44-4489-99a8-e2d32990877f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:41:04 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:04.282 100322 INFO neutron.agent.ovn.metadata.agent [-] Port ac555adf-df44-4489-99a8-e2d32990877f in datapath c99c822b-3191-49e5-b938-903e25b4a9bb unbound from our chassis
Sep 30 07:41:04 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:04.285 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c99c822b-3191-49e5-b938-903e25b4a9bb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:41:04 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:04.289 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[69f80d9f-e6b9-4adb-bfe6-e86c23d50d59]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:41:04 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:04.291 100322 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb namespace which is not needed anymore
Sep 30 07:41:04 compute-0 nova_compute[189265]: 2025-09-30 07:41:04.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:04 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000018.scope: Deactivated successfully.
Sep 30 07:41:04 compute-0 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000018.scope: Consumed 2.564s CPU time.
Sep 30 07:41:04 compute-0 systemd-machined[149233]: Machine qemu-20-instance-00000018 terminated.
Sep 30 07:41:04 compute-0 nova_compute[189265]: 2025-09-30 07:41:04.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:04 compute-0 nova_compute[189265]: 2025-09-30 07:41:04.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:04 compute-0 nova_compute[189265]: 2025-09-30 07:41:04.463 2 DEBUG nova.compute.manager [req-85f5bdc0-1307-4827-9a33-5752e075e0b1 req-8bfdd57c-00ee-4fbc-bd85-cceced7f3ec0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Received event network-vif-unplugged-ac555adf-df44-4489-99a8-e2d32990877f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:41:04 compute-0 nova_compute[189265]: 2025-09-30 07:41:04.464 2 DEBUG oslo_concurrency.lockutils [req-85f5bdc0-1307-4827-9a33-5752e075e0b1 req-8bfdd57c-00ee-4fbc-bd85-cceced7f3ec0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "dd9afa46-ab32-4a8e-861d-d825051c267a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:41:04 compute-0 nova_compute[189265]: 2025-09-30 07:41:04.464 2 DEBUG oslo_concurrency.lockutils [req-85f5bdc0-1307-4827-9a33-5752e075e0b1 req-8bfdd57c-00ee-4fbc-bd85-cceced7f3ec0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "dd9afa46-ab32-4a8e-861d-d825051c267a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:41:04 compute-0 nova_compute[189265]: 2025-09-30 07:41:04.464 2 DEBUG oslo_concurrency.lockutils [req-85f5bdc0-1307-4827-9a33-5752e075e0b1 req-8bfdd57c-00ee-4fbc-bd85-cceced7f3ec0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "dd9afa46-ab32-4a8e-861d-d825051c267a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:41:04 compute-0 nova_compute[189265]: 2025-09-30 07:41:04.465 2 DEBUG nova.compute.manager [req-85f5bdc0-1307-4827-9a33-5752e075e0b1 req-8bfdd57c-00ee-4fbc-bd85-cceced7f3ec0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] No waiting events found dispatching network-vif-unplugged-ac555adf-df44-4489-99a8-e2d32990877f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:41:04 compute-0 nova_compute[189265]: 2025-09-30 07:41:04.465 2 DEBUG nova.compute.manager [req-85f5bdc0-1307-4827-9a33-5752e075e0b1 req-8bfdd57c-00ee-4fbc-bd85-cceced7f3ec0 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Received event network-vif-unplugged-ac555adf-df44-4489-99a8-e2d32990877f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:41:04 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[222743]: [NOTICE]   (222747) : haproxy version is 3.0.5-8e879a5
Sep 30 07:41:04 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[222743]: [NOTICE]   (222747) : path to executable is /usr/sbin/haproxy
Sep 30 07:41:04 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[222743]: [WARNING]  (222747) : Exiting Master process...
Sep 30 07:41:04 compute-0 podman[223161]: 2025-09-30 07:41:04.474904432 +0000 UTC m=+0.049891008 container kill fe157cb3a43a8bc4896aa619491217ae4f99a666f669f318694e295ec1056d64 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 07:41:04 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[222743]: [ALERT]    (222747) : Current worker (222749) exited with code 143 (Terminated)
Sep 30 07:41:04 compute-0 neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb[222743]: [WARNING]  (222747) : All workers exited. Exiting... (0)
Sep 30 07:41:04 compute-0 systemd[1]: libpod-fe157cb3a43a8bc4896aa619491217ae4f99a666f669f318694e295ec1056d64.scope: Deactivated successfully.
Sep 30 07:41:04 compute-0 nova_compute[189265]: 2025-09-30 07:41:04.510 2 INFO nova.virt.libvirt.driver [-] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Instance destroyed successfully.
Sep 30 07:41:04 compute-0 nova_compute[189265]: 2025-09-30 07:41:04.510 2 DEBUG nova.objects.instance [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lazy-loading 'resources' on Instance uuid dd9afa46-ab32-4a8e-861d-d825051c267a obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:41:04 compute-0 podman[223188]: 2025-09-30 07:41:04.540030358 +0000 UTC m=+0.038290504 container died fe157cb3a43a8bc4896aa619491217ae4f99a666f669f318694e295ec1056d64 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:41:04 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fe157cb3a43a8bc4896aa619491217ae4f99a666f669f318694e295ec1056d64-userdata-shm.mount: Deactivated successfully.
Sep 30 07:41:04 compute-0 systemd[1]: var-lib-containers-storage-overlay-ff75452424700f7ca13b022c7e82d33f65adadd381e6a670d9a8afe61eab2f74-merged.mount: Deactivated successfully.
Sep 30 07:41:04 compute-0 podman[223188]: 2025-09-30 07:41:04.58281488 +0000 UTC m=+0.081074966 container cleanup fe157cb3a43a8bc4896aa619491217ae4f99a666f669f318694e295ec1056d64 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:41:04 compute-0 systemd[1]: libpod-conmon-fe157cb3a43a8bc4896aa619491217ae4f99a666f669f318694e295ec1056d64.scope: Deactivated successfully.
Sep 30 07:41:04 compute-0 podman[223194]: 2025-09-30 07:41:04.607081079 +0000 UTC m=+0.087244604 container remove fe157cb3a43a8bc4896aa619491217ae4f99a666f669f318694e295ec1056d64 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Sep 30 07:41:04 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:04.613 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[dc52a4f5-2e97-4dfc-9c28-3ddb04c488ff]: (4, ("Tue Sep 30 07:41:04 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb (fe157cb3a43a8bc4896aa619491217ae4f99a666f669f318694e295ec1056d64)\nfe157cb3a43a8bc4896aa619491217ae4f99a666f669f318694e295ec1056d64\nTue Sep 30 07:41:04 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb (fe157cb3a43a8bc4896aa619491217ae4f99a666f669f318694e295ec1056d64)\nfe157cb3a43a8bc4896aa619491217ae4f99a666f669f318694e295ec1056d64\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:41:04 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:04.614 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c65cedca-039c-4a13-b3d0-8c4ea4dc395a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:41:04 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:04.614 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c99c822b-3191-49e5-b938-903e25b4a9bb.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:41:04 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:04.615 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[939543a4-57dd-47ef-8e0d-90eb035bc308]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:41:04 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:04.616 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc99c822b-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:41:04 compute-0 nova_compute[189265]: 2025-09-30 07:41:04.617 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:04 compute-0 kernel: tapc99c822b-30: left promiscuous mode
Sep 30 07:41:04 compute-0 nova_compute[189265]: 2025-09-30 07:41:04.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:04 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:04.635 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[fed7b94e-7eef-4c1f-83ec-a949f358c37b]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:41:04 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:04.666 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[67154464-6d73-4fc8-9f66-18f4aac370a9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:41:04 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:04.668 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[9a1e4669-9d7c-415c-8e05-c95041376d5f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:41:04 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:04.687 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7e91fc21-74d1-47d2-869b-3e422f3a6fab]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 587209, 'reachable_time': 35668, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223229, 'error': None, 'target': 'ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:41:04 compute-0 systemd[1]: run-netns-ovnmeta\x2dc99c822b\x2d3191\x2d49e5\x2db938\x2d903e25b4a9bb.mount: Deactivated successfully.
Sep 30 07:41:04 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:04.692 100440 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c99c822b-3191-49e5-b938-903e25b4a9bb deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 07:41:04 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:04.692 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee4297b-cab1-4831-9d5c-9be0b8a014c9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.018 2 DEBUG nova.virt.libvirt.vif [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T07:38:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteStrategies-server-2082264783',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutestrategies-server-2082264783',id=24,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:39:04Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6431607f3dce4c88bbf6d17ee6cd45b2',ramdisk_id='',reservation_id='r-6z70tk4s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteStrategies-1096120513',owner_user_name='tempest-TestExecuteStrategies-1096120513-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:40:54Z,user_data=None,user_id='89ba5d19014145188ad2a3c812acdc88',uuid=dd9afa46-ab32-4a8e-861d-d825051c267a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ac555adf-df44-4489-99a8-e2d32990877f", "address": "fa:16:3e:00:6b:06", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac555adf-df", "ovs_interfaceid": "ac555adf-df44-4489-99a8-e2d32990877f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.019 2 DEBUG nova.network.os_vif_util [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converting VIF {"id": "ac555adf-df44-4489-99a8-e2d32990877f", "address": "fa:16:3e:00:6b:06", "network": {"id": "c99c822b-3191-49e5-b938-903e25b4a9bb", "bridge": "br-int", "label": "tempest-TestExecuteStrategies-1158349361-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61ab665f922649eba82c57a34e0b452b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapac555adf-df", "ovs_interfaceid": "ac555adf-df44-4489-99a8-e2d32990877f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.020 2 DEBUG nova.network.os_vif_util [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:00:6b:06,bridge_name='br-int',has_traffic_filtering=True,id=ac555adf-df44-4489-99a8-e2d32990877f,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac555adf-df') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.021 2 DEBUG os_vif [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:00:6b:06,bridge_name='br-int',has_traffic_filtering=True,id=ac555adf-df44-4489-99a8-e2d32990877f,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac555adf-df') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.024 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapac555adf-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.026 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.030 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=22fdd8e2-4c65-46d0-b363-0363e8ae2123) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.030 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.035 2 INFO os_vif [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:00:6b:06,bridge_name='br-int',has_traffic_filtering=True,id=ac555adf-df44-4489-99a8-e2d32990877f,network=Network(c99c822b-3191-49e5-b938-903e25b4a9bb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapac555adf-df')
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.035 2 INFO nova.virt.libvirt.driver [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Deleting instance files /var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a_del
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.036 2 INFO nova.virt.libvirt.driver [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Deletion of /var/lib/nova/instances/dd9afa46-ab32-4a8e-861d-d825051c267a_del complete
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.240 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance dd9afa46-ab32-4a8e-861d-d825051c267a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.241 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.241 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:41:03 up  1:38,  0 user,  load average: 0.65, 0.39, 0.32\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_6431607f3dce4c88bbf6d17ee6cd45b2': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.286 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.550 2 INFO nova.compute.manager [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Took 1.32 seconds to destroy the instance on the hypervisor.
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.551 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.551 2 DEBUG nova.compute.manager [-] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.551 2 DEBUG nova.network.neutron [-] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.551 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.793 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:41:05 compute-0 nova_compute[189265]: 2025-09-30 07:41:05.956 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:41:06 compute-0 nova_compute[189265]: 2025-09-30 07:41:06.302 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:41:06 compute-0 nova_compute[189265]: 2025-09-30 07:41:06.302 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.620s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:41:06 compute-0 nova_compute[189265]: 2025-09-30 07:41:06.549 2 DEBUG nova.compute.manager [req-75cdea49-4c0b-4b46-8c0e-db81a49e321b req-4f514093-e6b8-424b-bd6f-a051766a8d3c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Received event network-vif-unplugged-ac555adf-df44-4489-99a8-e2d32990877f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:41:06 compute-0 nova_compute[189265]: 2025-09-30 07:41:06.550 2 DEBUG oslo_concurrency.lockutils [req-75cdea49-4c0b-4b46-8c0e-db81a49e321b req-4f514093-e6b8-424b-bd6f-a051766a8d3c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "dd9afa46-ab32-4a8e-861d-d825051c267a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:41:06 compute-0 nova_compute[189265]: 2025-09-30 07:41:06.551 2 DEBUG oslo_concurrency.lockutils [req-75cdea49-4c0b-4b46-8c0e-db81a49e321b req-4f514093-e6b8-424b-bd6f-a051766a8d3c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "dd9afa46-ab32-4a8e-861d-d825051c267a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:41:06 compute-0 nova_compute[189265]: 2025-09-30 07:41:06.551 2 DEBUG oslo_concurrency.lockutils [req-75cdea49-4c0b-4b46-8c0e-db81a49e321b req-4f514093-e6b8-424b-bd6f-a051766a8d3c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "dd9afa46-ab32-4a8e-861d-d825051c267a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:41:06 compute-0 nova_compute[189265]: 2025-09-30 07:41:06.552 2 DEBUG nova.compute.manager [req-75cdea49-4c0b-4b46-8c0e-db81a49e321b req-4f514093-e6b8-424b-bd6f-a051766a8d3c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] No waiting events found dispatching network-vif-unplugged-ac555adf-df44-4489-99a8-e2d32990877f pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:41:06 compute-0 nova_compute[189265]: 2025-09-30 07:41:06.552 2 DEBUG nova.compute.manager [req-75cdea49-4c0b-4b46-8c0e-db81a49e321b req-4f514093-e6b8-424b-bd6f-a051766a8d3c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Received event network-vif-unplugged-ac555adf-df44-4489-99a8-e2d32990877f for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:41:06 compute-0 nova_compute[189265]: 2025-09-30 07:41:06.552 2 DEBUG nova.compute.manager [req-75cdea49-4c0b-4b46-8c0e-db81a49e321b req-4f514093-e6b8-424b-bd6f-a051766a8d3c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Received event network-vif-deleted-ac555adf-df44-4489-99a8-e2d32990877f external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:41:06 compute-0 nova_compute[189265]: 2025-09-30 07:41:06.553 2 INFO nova.compute.manager [req-75cdea49-4c0b-4b46-8c0e-db81a49e321b req-4f514093-e6b8-424b-bd6f-a051766a8d3c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Neutron deleted interface ac555adf-df44-4489-99a8-e2d32990877f; detaching it from the instance and deleting it from the info cache
Sep 30 07:41:06 compute-0 nova_compute[189265]: 2025-09-30 07:41:06.553 2 DEBUG nova.network.neutron [req-75cdea49-4c0b-4b46-8c0e-db81a49e321b req-4f514093-e6b8-424b-bd6f-a051766a8d3c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:41:06 compute-0 nova_compute[189265]: 2025-09-30 07:41:06.734 2 DEBUG nova.network.neutron [-] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:41:07 compute-0 nova_compute[189265]: 2025-09-30 07:41:07.063 2 DEBUG nova.compute.manager [req-75cdea49-4c0b-4b46-8c0e-db81a49e321b req-4f514093-e6b8-424b-bd6f-a051766a8d3c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Detach interface failed, port_id=ac555adf-df44-4489-99a8-e2d32990877f, reason: Instance dd9afa46-ab32-4a8e-861d-d825051c267a could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 07:41:07 compute-0 nova_compute[189265]: 2025-09-30 07:41:07.241 2 INFO nova.compute.manager [-] [instance: dd9afa46-ab32-4a8e-861d-d825051c267a] Took 1.69 seconds to deallocate network for instance.
Sep 30 07:41:07 compute-0 nova_compute[189265]: 2025-09-30 07:41:07.302 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:41:07 compute-0 nova_compute[189265]: 2025-09-30 07:41:07.762 2 DEBUG oslo_concurrency.lockutils [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:41:07 compute-0 nova_compute[189265]: 2025-09-30 07:41:07.763 2 DEBUG oslo_concurrency.lockutils [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:41:07 compute-0 nova_compute[189265]: 2025-09-30 07:41:07.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:41:07 compute-0 nova_compute[189265]: 2025-09-30 07:41:07.817 2 DEBUG nova.compute.provider_tree [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:41:08 compute-0 nova_compute[189265]: 2025-09-30 07:41:08.324 2 DEBUG nova.scheduler.client.report [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:41:08 compute-0 nova_compute[189265]: 2025-09-30 07:41:08.836 2 DEBUG oslo_concurrency.lockutils [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.073s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:41:08 compute-0 nova_compute[189265]: 2025-09-30 07:41:08.858 2 INFO nova.scheduler.client.report [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Deleted allocations for instance dd9afa46-ab32-4a8e-861d-d825051c267a
Sep 30 07:41:09 compute-0 nova_compute[189265]: 2025-09-30 07:41:09.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:09 compute-0 nova_compute[189265]: 2025-09-30 07:41:09.887 2 DEBUG oslo_concurrency.lockutils [None req-8fd3a1e6-27f5-41d5-b8bb-4eec336e0523 89ba5d19014145188ad2a3c812acdc88 6431607f3dce4c88bbf6d17ee6cd45b2 - - default default] Lock "dd9afa46-ab32-4a8e-861d-d825051c267a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.188s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:41:10 compute-0 nova_compute[189265]: 2025-09-30 07:41:10.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:14 compute-0 nova_compute[189265]: 2025-09-30 07:41:14.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:15 compute-0 nova_compute[189265]: 2025-09-30 07:41:15.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:15 compute-0 nova_compute[189265]: 2025-09-30 07:41:15.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:16 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:16.357 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:41:16 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:16.358 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:41:16 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:16.359 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:41:16 compute-0 nova_compute[189265]: 2025-09-30 07:41:16.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:18 compute-0 podman[223231]: 2025-09-30 07:41:18.512190282 +0000 UTC m=+0.088095690 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:41:19 compute-0 nova_compute[189265]: 2025-09-30 07:41:19.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:20 compute-0 nova_compute[189265]: 2025-09-30 07:41:20.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:20.582 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:41:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:20.582 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:41:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:20.582 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:41:24 compute-0 nova_compute[189265]: 2025-09-30 07:41:24.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:25 compute-0 nova_compute[189265]: 2025-09-30 07:41:25.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:27 compute-0 podman[223258]: 2025-09-30 07:41:27.514572615 +0000 UTC m=+0.091341271 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, container_name=iscsid)
Sep 30 07:41:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:27.949 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:8a:ba 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75afc4c4c3cd416898ef46cd7b7e99de', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a63f614-d134-4922-9cae-18c0918b6eb4, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=62cc21a6-25dd-4a68-bbc0-05c4bab51f8a) old=Port_Binding(mac=['fa:16:3e:87:8a:ba'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '75afc4c4c3cd416898ef46cd7b7e99de', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:41:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:27.950 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 62cc21a6-25dd-4a68-bbc0-05c4bab51f8a in datapath c37ccab9-b2b3-4600-9cd6-fc38d618b79f updated
Sep 30 07:41:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:27.951 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c37ccab9-b2b3-4600-9cd6-fc38d618b79f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:41:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:27.953 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[e53b98a2-74bc-49b5-93c5-9d04c1414869]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:41:29 compute-0 nova_compute[189265]: 2025-09-30 07:41:29.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:29 compute-0 podman[199733]: time="2025-09-30T07:41:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:41:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:41:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:41:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:41:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3015 "" "Go-http-client/1.1"
Sep 30 07:41:30 compute-0 nova_compute[189265]: 2025-09-30 07:41:30.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:31 compute-0 openstack_network_exporter[201859]: ERROR   07:41:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:41:31 compute-0 openstack_network_exporter[201859]: ERROR   07:41:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:41:31 compute-0 openstack_network_exporter[201859]: ERROR   07:41:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:41:31 compute-0 openstack_network_exporter[201859]: ERROR   07:41:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:41:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:41:31 compute-0 openstack_network_exporter[201859]: ERROR   07:41:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:41:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:41:31 compute-0 podman[223279]: 2025-09-30 07:41:31.524782179 +0000 UTC m=+0.099725953 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.openshift.expose-services=, release=1755695350, version=9.6, io.buildah.version=1.33.7, architecture=x86_64, maintainer=Red Hat, Inc.)
Sep 30 07:41:34 compute-0 nova_compute[189265]: 2025-09-30 07:41:34.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:34 compute-0 podman[223301]: 2025-09-30 07:41:34.513450813 +0000 UTC m=+0.086303548 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd)
Sep 30 07:41:34 compute-0 podman[223302]: 2025-09-30 07:41:34.535565038 +0000 UTC m=+0.099369433 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Sep 30 07:41:34 compute-0 podman[223303]: 2025-09-30 07:41:34.552800593 +0000 UTC m=+0.116319610 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Sep 30 07:41:35 compute-0 nova_compute[189265]: 2025-09-30 07:41:35.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:38.022 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c9:13:7c 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-ffdacf34-4448-4eaf-ac17-56523d53bf96', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ffdacf34-4448-4eaf-ac17-56523d53bf96', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb449ca8f36d45d88d1ef08bcb192ca6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bb611d8b-2bfc-46e6-9435-9e47c6ae3a13, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ff601981-9b04-458e-ab89-fb631cbc4536) old=Port_Binding(mac=['fa:16:3e:c9:13:7c'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-ffdacf34-4448-4eaf-ac17-56523d53bf96', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ffdacf34-4448-4eaf-ac17-56523d53bf96', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb449ca8f36d45d88d1ef08bcb192ca6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:41:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:38.023 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ff601981-9b04-458e-ab89-fb631cbc4536 in datapath ffdacf34-4448-4eaf-ac17-56523d53bf96 updated
Sep 30 07:41:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:38.025 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ffdacf34-4448-4eaf-ac17-56523d53bf96, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:41:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:41:38.026 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[55f2e82a-e4c3-443f-b08a-125cfc3a7bce]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:41:39 compute-0 nova_compute[189265]: 2025-09-30 07:41:39.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:40 compute-0 nova_compute[189265]: 2025-09-30 07:41:40.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:44 compute-0 nova_compute[189265]: 2025-09-30 07:41:44.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:45 compute-0 nova_compute[189265]: 2025-09-30 07:41:45.200 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:48 compute-0 ovn_controller[91436]: 2025-09-30T07:41:48Z|00251|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Sep 30 07:41:49 compute-0 nova_compute[189265]: 2025-09-30 07:41:49.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:49 compute-0 podman[223363]: 2025-09-30 07:41:49.467236285 +0000 UTC m=+0.053084145 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:41:50 compute-0 nova_compute[189265]: 2025-09-30 07:41:50.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:50 compute-0 nova_compute[189265]: 2025-09-30 07:41:50.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:41:52 compute-0 nova_compute[189265]: 2025-09-30 07:41:52.791 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:41:52 compute-0 nova_compute[189265]: 2025-09-30 07:41:52.792 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:41:53 compute-0 nova_compute[189265]: 2025-09-30 07:41:53.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:41:53 compute-0 nova_compute[189265]: 2025-09-30 07:41:53.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:41:54 compute-0 nova_compute[189265]: 2025-09-30 07:41:54.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:55 compute-0 nova_compute[189265]: 2025-09-30 07:41:55.203 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:58 compute-0 podman[223389]: 2025-09-30 07:41:58.460207639 +0000 UTC m=+0.047287768 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 07:41:59 compute-0 nova_compute[189265]: 2025-09-30 07:41:59.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:41:59 compute-0 podman[199733]: time="2025-09-30T07:41:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:41:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:41:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:41:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:41:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3010 "" "Go-http-client/1.1"
Sep 30 07:42:00 compute-0 nova_compute[189265]: 2025-09-30 07:42:00.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:01 compute-0 openstack_network_exporter[201859]: ERROR   07:42:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:42:01 compute-0 openstack_network_exporter[201859]: ERROR   07:42:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:42:01 compute-0 openstack_network_exporter[201859]: ERROR   07:42:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:42:01 compute-0 openstack_network_exporter[201859]: ERROR   07:42:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:42:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:42:01 compute-0 openstack_network_exporter[201859]: ERROR   07:42:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:42:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:42:01 compute-0 nova_compute[189265]: 2025-09-30 07:42:01.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:42:02 compute-0 nova_compute[189265]: 2025-09-30 07:42:02.301 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:42:02 compute-0 nova_compute[189265]: 2025-09-30 07:42:02.301 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:42:02 compute-0 nova_compute[189265]: 2025-09-30 07:42:02.301 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:42:02 compute-0 nova_compute[189265]: 2025-09-30 07:42:02.301 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:42:02 compute-0 podman[223410]: 2025-09-30 07:42:02.398511009 +0000 UTC m=+0.064837122 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.6, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9-minimal, architecture=x86_64, io.openshift.expose-services=)
Sep 30 07:42:02 compute-0 nova_compute[189265]: 2025-09-30 07:42:02.442 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:42:02 compute-0 nova_compute[189265]: 2025-09-30 07:42:02.443 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:42:02 compute-0 nova_compute[189265]: 2025-09-30 07:42:02.461 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.018s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:42:02 compute-0 nova_compute[189265]: 2025-09-30 07:42:02.462 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5857MB free_disk=73.30371475219727GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:42:02 compute-0 nova_compute[189265]: 2025-09-30 07:42:02.463 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:42:02 compute-0 nova_compute[189265]: 2025-09-30 07:42:02.463 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:42:03 compute-0 nova_compute[189265]: 2025-09-30 07:42:03.505 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:42:03 compute-0 nova_compute[189265]: 2025-09-30 07:42:03.505 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:42:02 up  1:39,  0 user,  load average: 0.30, 0.34, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:42:03 compute-0 nova_compute[189265]: 2025-09-30 07:42:03.522 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:42:04 compute-0 nova_compute[189265]: 2025-09-30 07:42:04.028 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:42:04 compute-0 nova_compute[189265]: 2025-09-30 07:42:04.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:04 compute-0 nova_compute[189265]: 2025-09-30 07:42:04.539 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:42:04 compute-0 nova_compute[189265]: 2025-09-30 07:42:04.539 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.076s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:42:05 compute-0 nova_compute[189265]: 2025-09-30 07:42:05.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:05 compute-0 podman[223433]: 2025-09-30 07:42:05.462111492 +0000 UTC m=+0.048984558 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Sep 30 07:42:05 compute-0 podman[223432]: 2025-09-30 07:42:05.488710886 +0000 UTC m=+0.072115182 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 07:42:05 compute-0 nova_compute[189265]: 2025-09-30 07:42:05.539 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:42:05 compute-0 nova_compute[189265]: 2025-09-30 07:42:05.540 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:42:05 compute-0 podman[223434]: 2025-09-30 07:42:05.550922991 +0000 UTC m=+0.126231135 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 07:42:07 compute-0 nova_compute[189265]: 2025-09-30 07:42:07.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:42:08 compute-0 nova_compute[189265]: 2025-09-30 07:42:08.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:42:09 compute-0 nova_compute[189265]: 2025-09-30 07:42:09.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:10 compute-0 nova_compute[189265]: 2025-09-30 07:42:10.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:14 compute-0 nova_compute[189265]: 2025-09-30 07:42:14.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:15 compute-0 nova_compute[189265]: 2025-09-30 07:42:15.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:15 compute-0 unix_chkpwd[223499]: password check failed for user (root)
Sep 30 07:42:15 compute-0 sshd-session[223497]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126  user=root
Sep 30 07:42:17 compute-0 sshd-session[223497]: Failed password for root from 52.224.109.126 port 57882 ssh2
Sep 30 07:42:17 compute-0 nova_compute[189265]: 2025-09-30 07:42:17.555 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Acquiring lock "6616fa8c-6043-4809-970f-befa571a47bf" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:42:17 compute-0 nova_compute[189265]: 2025-09-30 07:42:17.555 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "6616fa8c-6043-4809-970f-befa571a47bf" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:42:17 compute-0 sshd-session[223497]: Connection closed by authenticating user root 52.224.109.126 port 57882 [preauth]
Sep 30 07:42:18 compute-0 nova_compute[189265]: 2025-09-30 07:42:18.061 2 DEBUG nova.compute.manager [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 07:42:18 compute-0 nova_compute[189265]: 2025-09-30 07:42:18.616 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:42:18 compute-0 nova_compute[189265]: 2025-09-30 07:42:18.616 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:42:18 compute-0 nova_compute[189265]: 2025-09-30 07:42:18.624 2 DEBUG nova.virt.hardware [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 07:42:18 compute-0 nova_compute[189265]: 2025-09-30 07:42:18.624 2 INFO nova.compute.claims [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Claim successful on node compute-0.ctlplane.example.com
Sep 30 07:42:19 compute-0 nova_compute[189265]: 2025-09-30 07:42:19.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:19 compute-0 nova_compute[189265]: 2025-09-30 07:42:19.682 2 DEBUG nova.compute.provider_tree [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:42:20 compute-0 nova_compute[189265]: 2025-09-30 07:42:20.193 2 DEBUG nova.scheduler.client.report [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:42:20 compute-0 nova_compute[189265]: 2025-09-30 07:42:20.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:20 compute-0 podman[223500]: 2025-09-30 07:42:20.525123171 +0000 UTC m=+0.097480179 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:42:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:20.583 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:42:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:20.584 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:42:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:20.584 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:42:20 compute-0 nova_compute[189265]: 2025-09-30 07:42:20.706 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.090s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:42:20 compute-0 nova_compute[189265]: 2025-09-30 07:42:20.707 2 DEBUG nova.compute.manager [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 07:42:21 compute-0 nova_compute[189265]: 2025-09-30 07:42:21.221 2 DEBUG nova.compute.manager [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 07:42:21 compute-0 nova_compute[189265]: 2025-09-30 07:42:21.222 2 DEBUG nova.network.neutron [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 07:42:21 compute-0 nova_compute[189265]: 2025-09-30 07:42:21.222 2 WARNING neutronclient.v2_0.client [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:42:21 compute-0 nova_compute[189265]: 2025-09-30 07:42:21.223 2 WARNING neutronclient.v2_0.client [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:42:21 compute-0 nova_compute[189265]: 2025-09-30 07:42:21.733 2 INFO nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 07:42:22 compute-0 nova_compute[189265]: 2025-09-30 07:42:22.244 2 DEBUG nova.compute.manager [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 07:42:22 compute-0 nova_compute[189265]: 2025-09-30 07:42:22.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:22.323 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:42:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:22.324 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.088 2 DEBUG nova.network.neutron [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Successfully created port: 103df688-88b9-4fd1-98fe-1f1b8db21a1d _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.267 2 DEBUG nova.compute.manager [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.269 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.269 2 INFO nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Creating image(s)
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.270 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Acquiring lock "/var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.270 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "/var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.271 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "/var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.271 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.275 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.277 2 DEBUG oslo_concurrency.processutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.366 2 DEBUG oslo_concurrency.processutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.367 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.368 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.368 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.372 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.372 2 DEBUG oslo_concurrency.processutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.420 2 DEBUG oslo_concurrency.processutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.421 2 DEBUG oslo_concurrency.processutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.461 2 DEBUG oslo_concurrency.processutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.462 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.094s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.462 2 DEBUG oslo_concurrency.processutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.511 2 DEBUG oslo_concurrency.processutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.513 2 DEBUG nova.virt.disk.api [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Checking if we can resize image /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.513 2 DEBUG oslo_concurrency.processutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.585 2 DEBUG oslo_concurrency.processutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.586 2 DEBUG nova.virt.disk.api [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Cannot resize image /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.587 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.588 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Ensure instance console log exists: /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.588 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.589 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:42:23 compute-0 nova_compute[189265]: 2025-09-30 07:42:23.589 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:42:24 compute-0 nova_compute[189265]: 2025-09-30 07:42:24.355 2 DEBUG nova.network.neutron [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Successfully updated port: 103df688-88b9-4fd1-98fe-1f1b8db21a1d _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 07:42:24 compute-0 nova_compute[189265]: 2025-09-30 07:42:24.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:24 compute-0 nova_compute[189265]: 2025-09-30 07:42:24.425 2 DEBUG nova.compute.manager [req-c694e350-edc6-4d2a-8774-ef45d7ebb526 req-83ce9ed3-1385-4ced-a492-a6b2cd93194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Received event network-changed-103df688-88b9-4fd1-98fe-1f1b8db21a1d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:42:24 compute-0 nova_compute[189265]: 2025-09-30 07:42:24.425 2 DEBUG nova.compute.manager [req-c694e350-edc6-4d2a-8774-ef45d7ebb526 req-83ce9ed3-1385-4ced-a492-a6b2cd93194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Refreshing instance network info cache due to event network-changed-103df688-88b9-4fd1-98fe-1f1b8db21a1d. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 07:42:24 compute-0 nova_compute[189265]: 2025-09-30 07:42:24.425 2 DEBUG oslo_concurrency.lockutils [req-c694e350-edc6-4d2a-8774-ef45d7ebb526 req-83ce9ed3-1385-4ced-a492-a6b2cd93194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-6616fa8c-6043-4809-970f-befa571a47bf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:42:24 compute-0 nova_compute[189265]: 2025-09-30 07:42:24.425 2 DEBUG oslo_concurrency.lockutils [req-c694e350-edc6-4d2a-8774-ef45d7ebb526 req-83ce9ed3-1385-4ced-a492-a6b2cd93194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-6616fa8c-6043-4809-970f-befa571a47bf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:42:24 compute-0 nova_compute[189265]: 2025-09-30 07:42:24.426 2 DEBUG nova.network.neutron [req-c694e350-edc6-4d2a-8774-ef45d7ebb526 req-83ce9ed3-1385-4ced-a492-a6b2cd93194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Refreshing network info cache for port 103df688-88b9-4fd1-98fe-1f1b8db21a1d _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 07:42:24 compute-0 nova_compute[189265]: 2025-09-30 07:42:24.862 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Acquiring lock "refresh_cache-6616fa8c-6043-4809-970f-befa571a47bf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:42:24 compute-0 nova_compute[189265]: 2025-09-30 07:42:24.932 2 WARNING neutronclient.v2_0.client [req-c694e350-edc6-4d2a-8774-ef45d7ebb526 req-83ce9ed3-1385-4ced-a492-a6b2cd93194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:42:25 compute-0 nova_compute[189265]: 2025-09-30 07:42:25.275 2 DEBUG nova.network.neutron [req-c694e350-edc6-4d2a-8774-ef45d7ebb526 req-83ce9ed3-1385-4ced-a492-a6b2cd93194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:42:25 compute-0 nova_compute[189265]: 2025-09-30 07:42:25.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:25 compute-0 nova_compute[189265]: 2025-09-30 07:42:25.416 2 DEBUG nova.network.neutron [req-c694e350-edc6-4d2a-8774-ef45d7ebb526 req-83ce9ed3-1385-4ced-a492-a6b2cd93194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:42:25 compute-0 nova_compute[189265]: 2025-09-30 07:42:25.961 2 DEBUG oslo_concurrency.lockutils [req-c694e350-edc6-4d2a-8774-ef45d7ebb526 req-83ce9ed3-1385-4ced-a492-a6b2cd93194c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-6616fa8c-6043-4809-970f-befa571a47bf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:42:25 compute-0 nova_compute[189265]: 2025-09-30 07:42:25.962 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Acquired lock "refresh_cache-6616fa8c-6043-4809-970f-befa571a47bf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:42:25 compute-0 nova_compute[189265]: 2025-09-30 07:42:25.963 2 DEBUG nova.network.neutron [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:42:27 compute-0 nova_compute[189265]: 2025-09-30 07:42:27.009 2 DEBUG nova.network.neutron [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:42:27 compute-0 nova_compute[189265]: 2025-09-30 07:42:27.376 2 WARNING neutronclient.v2_0.client [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:42:27 compute-0 nova_compute[189265]: 2025-09-30 07:42:27.563 2 DEBUG nova.network.neutron [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Updating instance_info_cache with network_info: [{"id": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "address": "fa:16:3e:b7:a8:8b", "network": {"id": "c37ccab9-b2b3-4600-9cd6-fc38d618b79f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1008995657-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75afc4c4c3cd416898ef46cd7b7e99de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap103df688-88", "ovs_interfaceid": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.083 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Releasing lock "refresh_cache-6616fa8c-6043-4809-970f-befa571a47bf" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.084 2 DEBUG nova.compute.manager [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Instance network_info: |[{"id": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "address": "fa:16:3e:b7:a8:8b", "network": {"id": "c37ccab9-b2b3-4600-9cd6-fc38d618b79f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1008995657-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75afc4c4c3cd416898ef46cd7b7e99de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap103df688-88", "ovs_interfaceid": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.089 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Start _get_guest_xml network_info=[{"id": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "address": "fa:16:3e:b7:a8:8b", "network": {"id": "c37ccab9-b2b3-4600-9cd6-fc38d618b79f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1008995657-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75afc4c4c3cd416898ef46cd7b7e99de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap103df688-88", "ovs_interfaceid": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '0c6b92f5-9861-49e4-862d-3ffd84520dfa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.095 2 WARNING nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.097 2 DEBUG nova.virt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-345872352', uuid='6616fa8c-6043-4809-970f-befa571a47bf'), owner=OwnerMeta(userid='5c02a0a41ab14f6a92e1e6e2798736ae', username='tempest-TestExecuteVmWorkloadBalanceStrategy-264644006-project-admin', projectid='eb449ca8f36d45d88d1ef08bcb192ca6', projectname='tempest-TestExecuteVmWorkloadBalanceStrategy-264644006'), image=ImageMeta(id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "address": "fa:16:3e:b7:a8:8b", "network": {"id": "c37ccab9-b2b3-4600-9cd6-fc38d618b79f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1008995657-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75afc4c4c3cd416898ef46cd7b7e99de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap103df688-88", "ovs_interfaceid": 
"103df688-88b9-4fd1-98fe-1f1b8db21a1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759218148.0970814) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.103 2 DEBUG nova.virt.libvirt.host [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.104 2 DEBUG nova.virt.libvirt.host [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.108 2 DEBUG nova.virt.libvirt.host [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.109 2 DEBUG nova.virt.libvirt.host [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.110 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.110 2 DEBUG nova.virt.hardware [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T07:07:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.111 2 DEBUG nova.virt.hardware [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.111 2 DEBUG nova.virt.hardware [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.112 2 DEBUG nova.virt.hardware [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.112 2 DEBUG nova.virt.hardware [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.113 2 DEBUG nova.virt.hardware [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.113 2 DEBUG nova.virt.hardware [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.114 2 DEBUG nova.virt.hardware [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.114 2 DEBUG nova.virt.hardware [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.114 2 DEBUG nova.virt.hardware [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.115 2 DEBUG nova.virt.hardware [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.122 2 DEBUG nova.virt.libvirt.vif [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:42:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-345872352',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-345872352',id=27,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eb449ca8f36d45d88d1ef08bcb192ca6',ramdisk_id='',reservation_id='r-s4id5kua',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-264644006',owner_user_name='tempest-Tes
tExecuteVmWorkloadBalanceStrategy-264644006-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:42:22Z,user_data=None,user_id='5c02a0a41ab14f6a92e1e6e2798736ae',uuid=6616fa8c-6043-4809-970f-befa571a47bf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "address": "fa:16:3e:b7:a8:8b", "network": {"id": "c37ccab9-b2b3-4600-9cd6-fc38d618b79f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1008995657-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75afc4c4c3cd416898ef46cd7b7e99de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap103df688-88", "ovs_interfaceid": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.122 2 DEBUG nova.network.os_vif_util [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Converting VIF {"id": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "address": "fa:16:3e:b7:a8:8b", "network": {"id": "c37ccab9-b2b3-4600-9cd6-fc38d618b79f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1008995657-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75afc4c4c3cd416898ef46cd7b7e99de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap103df688-88", "ovs_interfaceid": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.124 2 DEBUG nova.network.os_vif_util [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:a8:8b,bridge_name='br-int',has_traffic_filtering=True,id=103df688-88b9-4fd1-98fe-1f1b8db21a1d,network=Network(c37ccab9-b2b3-4600-9cd6-fc38d618b79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap103df688-88') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.125 2 DEBUG nova.objects.instance [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6616fa8c-6043-4809-970f-befa571a47bf obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.720 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] End _get_guest_xml xml=<domain type="kvm">
Sep 30 07:42:28 compute-0 nova_compute[189265]:   <uuid>6616fa8c-6043-4809-970f-befa571a47bf</uuid>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   <name>instance-0000001b</name>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   <memory>131072</memory>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   <vcpu>1</vcpu>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteVmWorkloadBalanceStrategy-server-345872352</nova:name>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:42:28</nova:creationTime>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:42:28 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:42:28 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:42:28 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:42:28 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:42:28 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:42:28 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:42:28 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:42:28 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:42:28 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:42:28 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:42:28 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:42:28 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:42:28 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:42:28 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:42:28 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:42:28 compute-0 nova_compute[189265]:         <nova:user uuid="5c02a0a41ab14f6a92e1e6e2798736ae">tempest-TestExecuteVmWorkloadBalanceStrategy-264644006-project-admin</nova:user>
Sep 30 07:42:28 compute-0 nova_compute[189265]:         <nova:project uuid="eb449ca8f36d45d88d1ef08bcb192ca6">tempest-TestExecuteVmWorkloadBalanceStrategy-264644006</nova:project>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:42:28 compute-0 nova_compute[189265]:         <nova:port uuid="103df688-88b9-4fd1-98fe-1f1b8db21a1d">
Sep 30 07:42:28 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <system>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <entry name="serial">6616fa8c-6043-4809-970f-befa571a47bf</entry>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <entry name="uuid">6616fa8c-6043-4809-970f-befa571a47bf</entry>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     </system>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   <os>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   </os>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   <features>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <vmcoreinfo/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   </features>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   <cpu mode="host-model" match="exact">
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk.config"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:b7:a8:8b"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <target dev="tap103df688-88"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/console.log" append="off"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <video>
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     </video>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <controller type="usb" index="0"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:42:28 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:42:28 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:42:28 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:42:28 compute-0 nova_compute[189265]: </domain>
Sep 30 07:42:28 compute-0 nova_compute[189265]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.721 2 DEBUG nova.compute.manager [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Preparing to wait for external event network-vif-plugged-103df688-88b9-4fd1-98fe-1f1b8db21a1d prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.722 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Acquiring lock "6616fa8c-6043-4809-970f-befa571a47bf-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.722 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "6616fa8c-6043-4809-970f-befa571a47bf-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.722 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "6616fa8c-6043-4809-970f-befa571a47bf-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.723 2 DEBUG nova.virt.libvirt.vif [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:42:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-345872352',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-345872352',id=27,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eb449ca8f36d45d88d1ef08bcb192ca6',ramdisk_id='',reservation_id='r-s4id5kua',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-264644006',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-264644006-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:42:22Z,user_data=None,user_id='5c02a0a41ab14f6a92e1e6e2798736ae',uuid=6616fa8c-6043-4809-970f-befa571a47bf,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "address": "fa:16:3e:b7:a8:8b", "network": {"id": "c37ccab9-b2b3-4600-9cd6-fc38d618b79f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1008995657-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75afc4c4c3cd416898ef46cd7b7e99de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap103df688-88", "ovs_interfaceid": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.723 2 DEBUG nova.network.os_vif_util [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Converting VIF {"id": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "address": "fa:16:3e:b7:a8:8b", "network": {"id": "c37ccab9-b2b3-4600-9cd6-fc38d618b79f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1008995657-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75afc4c4c3cd416898ef46cd7b7e99de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap103df688-88", "ovs_interfaceid": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.724 2 DEBUG nova.network.os_vif_util [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:a8:8b,bridge_name='br-int',has_traffic_filtering=True,id=103df688-88b9-4fd1-98fe-1f1b8db21a1d,network=Network(c37ccab9-b2b3-4600-9cd6-fc38d618b79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap103df688-88') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.724 2 DEBUG os_vif [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:a8:8b,bridge_name='br-int',has_traffic_filtering=True,id=103df688-88b9-4fd1-98fe-1f1b8db21a1d,network=Network(c37ccab9-b2b3-4600-9cd6-fc38d618b79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap103df688-88') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.725 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.726 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.727 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '8cb090b4-c558-54d8-bcf2-707d8fad101d', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.728 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.732 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.734 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.734 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap103df688-88, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.734 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap103df688-88, col_values=(('qos', UUID('0ee2f6f4-7ad2-4f81-b13f-c63c35d618da')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.735 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap103df688-88, col_values=(('external_ids', {'iface-id': '103df688-88b9-4fd1-98fe-1f1b8db21a1d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b7:a8:8b', 'vm-uuid': '6616fa8c-6043-4809-970f-befa571a47bf'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:42:28 compute-0 NetworkManager[51813]: <info>  [1759218148.7371] manager: (tap103df688-88): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.738 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:28 compute-0 nova_compute[189265]: 2025-09-30 07:42:28.744 2 INFO os_vif [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:a8:8b,bridge_name='br-int',has_traffic_filtering=True,id=103df688-88b9-4fd1-98fe-1f1b8db21a1d,network=Network(c37ccab9-b2b3-4600-9cd6-fc38d618b79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap103df688-88')
Sep 30 07:42:29 compute-0 nova_compute[189265]: 2025-09-30 07:42:29.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:29 compute-0 podman[223544]: 2025-09-30 07:42:29.488263357 +0000 UTC m=+0.068492817 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Sep 30 07:42:29 compute-0 podman[199733]: time="2025-09-30T07:42:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:42:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:42:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:42:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:42:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Sep 30 07:42:30 compute-0 nova_compute[189265]: 2025-09-30 07:42:30.339 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:42:30 compute-0 nova_compute[189265]: 2025-09-30 07:42:30.339 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:42:30 compute-0 nova_compute[189265]: 2025-09-30 07:42:30.340 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] No VIF found with MAC fa:16:3e:b7:a8:8b, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 07:42:30 compute-0 nova_compute[189265]: 2025-09-30 07:42:30.341 2 INFO nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Using config drive
Sep 30 07:42:30 compute-0 nova_compute[189265]: 2025-09-30 07:42:30.850 2 WARNING neutronclient.v2_0.client [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:42:31 compute-0 nova_compute[189265]: 2025-09-30 07:42:31.101 2 INFO nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Creating config drive at /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk.config
Sep 30 07:42:31 compute-0 nova_compute[189265]: 2025-09-30 07:42:31.111 2 DEBUG oslo_concurrency.processutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpbzeak2tb execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:42:31 compute-0 nova_compute[189265]: 2025-09-30 07:42:31.256 2 DEBUG oslo_concurrency.processutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpbzeak2tb" returned: 0 in 0.145s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.326 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:42:31 compute-0 kernel: tap103df688-88: entered promiscuous mode
Sep 30 07:42:31 compute-0 NetworkManager[51813]: <info>  [1759218151.3287] manager: (tap103df688-88): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Sep 30 07:42:31 compute-0 ovn_controller[91436]: 2025-09-30T07:42:31Z|00252|binding|INFO|Claiming lport 103df688-88b9-4fd1-98fe-1f1b8db21a1d for this chassis.
Sep 30 07:42:31 compute-0 ovn_controller[91436]: 2025-09-30T07:42:31Z|00253|binding|INFO|103df688-88b9-4fd1-98fe-1f1b8db21a1d: Claiming fa:16:3e:b7:a8:8b 10.100.0.5
Sep 30 07:42:31 compute-0 nova_compute[189265]: 2025-09-30 07:42:31.330 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:31 compute-0 nova_compute[189265]: 2025-09-30 07:42:31.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:31 compute-0 systemd-udevd[223580]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.353 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:a8:8b 10.100.0.5'], port_security=['fa:16:3e:b7:a8:8b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '6616fa8c-6043-4809-970f-befa571a47bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb449ca8f36d45d88d1ef08bcb192ca6', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd8d8aea8-dfd7-4028-b831-7a1c1bc3f21e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a63f614-d134-4922-9cae-18c0918b6eb4, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=103df688-88b9-4fd1-98fe-1f1b8db21a1d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.354 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 103df688-88b9-4fd1-98fe-1f1b8db21a1d in datapath c37ccab9-b2b3-4600-9cd6-fc38d618b79f bound to our chassis
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.355 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c37ccab9-b2b3-4600-9cd6-fc38d618b79f
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.365 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0ddc1c07-836d-40ef-8019-97b7da206202]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.366 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc37ccab9-b1 in ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.367 210650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc37ccab9-b0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.367 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[4aa63d09-5a4c-46ae-b3dc-7e689e35577f]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.368 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[29e8a927-756b-4a24-b016-e2957ad73965]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 NetworkManager[51813]: <info>  [1759218151.3736] device (tap103df688-88): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:42:31 compute-0 NetworkManager[51813]: <info>  [1759218151.3761] device (tap103df688-88): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.378 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[ba877454-2038-4f99-bcb2-4fc71da61495]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 systemd-machined[149233]: New machine qemu-21-instance-0000001b.
Sep 30 07:42:31 compute-0 nova_compute[189265]: 2025-09-30 07:42:31.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.405 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0c23c000-5d1d-4b6f-8581-451bdaa1f641]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 systemd[1]: Started Virtual Machine qemu-21-instance-0000001b.
Sep 30 07:42:31 compute-0 ovn_controller[91436]: 2025-09-30T07:42:31Z|00254|binding|INFO|Setting lport 103df688-88b9-4fd1-98fe-1f1b8db21a1d ovn-installed in OVS
Sep 30 07:42:31 compute-0 ovn_controller[91436]: 2025-09-30T07:42:31Z|00255|binding|INFO|Setting lport 103df688-88b9-4fd1-98fe-1f1b8db21a1d up in Southbound
Sep 30 07:42:31 compute-0 openstack_network_exporter[201859]: ERROR   07:42:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:42:31 compute-0 openstack_network_exporter[201859]: ERROR   07:42:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:42:31 compute-0 openstack_network_exporter[201859]: ERROR   07:42:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:42:31 compute-0 openstack_network_exporter[201859]: ERROR   07:42:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:42:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:42:31 compute-0 nova_compute[189265]: 2025-09-30 07:42:31.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:31 compute-0 openstack_network_exporter[201859]: ERROR   07:42:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:42:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.448 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[3d336e07-fc0b-4ae5-99ad-6a8de0b25210]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 systemd-udevd[223587]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.454 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[03b0e7f0-b869-49a9-b9c4-9128b9350eb1]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 NetworkManager[51813]: <info>  [1759218151.4554] manager: (tapc37ccab9-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/89)
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.490 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[69ec9ffb-439a-4786-b4ab-ec34c28dde22]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.493 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[87174a73-1882-4fb4-ab1c-30ddb5c89553]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 NetworkManager[51813]: <info>  [1759218151.5154] device (tapc37ccab9-b0): carrier: link connected
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.524 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[ec0c7d4b-c633-4736-bb31-b789f043ce67]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.540 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7315ac83-f8ba-4378-a884-195e9b042457]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc37ccab9-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:87:8a:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600923, 'reachable_time': 28999, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223616, 'error': None, 'target': 'ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.558 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f740ab9c-2202-4bcb-818e-1abb93764447]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe87:8aba'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600923, 'tstamp': 600923}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223617, 'error': None, 'target': 'ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.576 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b05e72b9-2486-4d2f-bde7-d72f37d3d2d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc37ccab9-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:87:8a:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600923, 'reachable_time': 28999, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223618, 'error': None, 'target': 'ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 nova_compute[189265]: 2025-09-30 07:42:31.593 2 DEBUG nova.compute.manager [req-9488989f-5bc1-495f-9809-482a958d6ae8 req-6a6771ad-a86f-4546-84cb-f0ce75cf1572 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Received event network-vif-plugged-103df688-88b9-4fd1-98fe-1f1b8db21a1d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:42:31 compute-0 nova_compute[189265]: 2025-09-30 07:42:31.593 2 DEBUG oslo_concurrency.lockutils [req-9488989f-5bc1-495f-9809-482a958d6ae8 req-6a6771ad-a86f-4546-84cb-f0ce75cf1572 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "6616fa8c-6043-4809-970f-befa571a47bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:42:31 compute-0 nova_compute[189265]: 2025-09-30 07:42:31.593 2 DEBUG oslo_concurrency.lockutils [req-9488989f-5bc1-495f-9809-482a958d6ae8 req-6a6771ad-a86f-4546-84cb-f0ce75cf1572 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "6616fa8c-6043-4809-970f-befa571a47bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:42:31 compute-0 nova_compute[189265]: 2025-09-30 07:42:31.594 2 DEBUG oslo_concurrency.lockutils [req-9488989f-5bc1-495f-9809-482a958d6ae8 req-6a6771ad-a86f-4546-84cb-f0ce75cf1572 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "6616fa8c-6043-4809-970f-befa571a47bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:42:31 compute-0 nova_compute[189265]: 2025-09-30 07:42:31.594 2 DEBUG nova.compute.manager [req-9488989f-5bc1-495f-9809-482a958d6ae8 req-6a6771ad-a86f-4546-84cb-f0ce75cf1572 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Processing event network-vif-plugged-103df688-88b9-4fd1-98fe-1f1b8db21a1d _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.610 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0a82239a-bb89-4c10-adac-2779d87be9bf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.673 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[45185360-07c7-49bf-9a0e-bb23cacf514b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.674 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc37ccab9-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.675 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.675 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc37ccab9-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:42:31 compute-0 kernel: tapc37ccab9-b0: entered promiscuous mode
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.681 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc37ccab9-b0, col_values=(('external_ids', {'iface-id': '62cc21a6-25dd-4a68-bbc0-05c4bab51f8a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:42:31 compute-0 NetworkManager[51813]: <info>  [1759218151.6811] manager: (tapc37ccab9-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Sep 30 07:42:31 compute-0 ovn_controller[91436]: 2025-09-30T07:42:31Z|00256|binding|INFO|Releasing lport 62cc21a6-25dd-4a68-bbc0-05c4bab51f8a from this chassis (sb_readonly=0)
Sep 30 07:42:31 compute-0 nova_compute[189265]: 2025-09-30 07:42:31.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:31 compute-0 nova_compute[189265]: 2025-09-30 07:42:31.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.700 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[243afb38-d5d9-4edd-8b3a-c124232e093c]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.701 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c37ccab9-b2b3-4600-9cd6-fc38d618b79f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c37ccab9-b2b3-4600-9cd6-fc38d618b79f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.701 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c37ccab9-b2b3-4600-9cd6-fc38d618b79f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c37ccab9-b2b3-4600-9cd6-fc38d618b79f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.701 100322 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for c37ccab9-b2b3-4600-9cd6-fc38d618b79f disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.701 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c37ccab9-b2b3-4600-9cd6-fc38d618b79f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c37ccab9-b2b3-4600-9cd6-fc38d618b79f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.702 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[4601af67-58c7-4e39-b9d8-992bc6cceb62]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.702 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c37ccab9-b2b3-4600-9cd6-fc38d618b79f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c37ccab9-b2b3-4600-9cd6-fc38d618b79f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.703 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b3241f10-1b44-4e82-9902-c09a44e24946]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.703 100322 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: global
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     log         /dev/log local0 debug
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     log-tag     haproxy-metadata-proxy-c37ccab9-b2b3-4600-9cd6-fc38d618b79f
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     user        root
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     group       root
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     maxconn     1024
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     pidfile     /var/lib/neutron/external/pids/c37ccab9-b2b3-4600-9cd6-fc38d618b79f.pid.haproxy
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     daemon
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: defaults
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     log global
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     mode http
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     option httplog
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     option dontlognull
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     option http-server-close
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     option forwardfor
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     retries                 3
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     timeout http-request    30s
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     timeout connect         30s
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     timeout client          32s
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     timeout server          32s
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     timeout http-keep-alive 30s
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: listen listener
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     bind 169.254.169.254:80
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:     http-request add-header X-OVN-Network-ID c37ccab9-b2b3-4600-9cd6-fc38d618b79f
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 07:42:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:42:31.704 100322 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'env', 'PROCESS_TAG=haproxy-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c37ccab9-b2b3-4600-9cd6-fc38d618b79f.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 07:42:32 compute-0 podman[223656]: 2025-09-30 07:42:32.147102563 +0000 UTC m=+0.058500300 container create 971cf3f62cd4f68378814c4133ce4cc2f47af5011357a85aa85b272f4f75b337 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Sep 30 07:42:32 compute-0 systemd[1]: Started libpod-conmon-971cf3f62cd4f68378814c4133ce4cc2f47af5011357a85aa85b272f4f75b337.scope.
Sep 30 07:42:32 compute-0 podman[223656]: 2025-09-30 07:42:32.115070883 +0000 UTC m=+0.026468690 image pull eeebcc09bc72f81ab45f5ab87eb8f6a7b554b949227aeec082bdb0732754ddc8 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 07:42:32 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:42:32 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4b8c8311359e8cbbe1370ab8c5db668a2a209a147dc25e549d690826104fa59b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 07:42:32 compute-0 podman[223656]: 2025-09-30 07:42:32.233673638 +0000 UTC m=+0.145071365 container init 971cf3f62cd4f68378814c4133ce4cc2f47af5011357a85aa85b272f4f75b337 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest)
Sep 30 07:42:32 compute-0 podman[223656]: 2025-09-30 07:42:32.239195796 +0000 UTC m=+0.150593513 container start 971cf3f62cd4f68378814c4133ce4cc2f47af5011357a85aa85b272f4f75b337 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20250930, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4)
Sep 30 07:42:32 compute-0 neutron-haproxy-ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f[223671]: [NOTICE]   (223675) : New worker (223677) forked
Sep 30 07:42:32 compute-0 neutron-haproxy-ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f[223671]: [NOTICE]   (223675) : Loading success.
Sep 30 07:42:32 compute-0 nova_compute[189265]: 2025-09-30 07:42:32.520 2 DEBUG nova.compute.manager [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 07:42:32 compute-0 nova_compute[189265]: 2025-09-30 07:42:32.524 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 07:42:32 compute-0 nova_compute[189265]: 2025-09-30 07:42:32.528 2 INFO nova.virt.libvirt.driver [-] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Instance spawned successfully.
Sep 30 07:42:32 compute-0 nova_compute[189265]: 2025-09-30 07:42:32.528 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 07:42:33 compute-0 nova_compute[189265]: 2025-09-30 07:42:33.045 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:42:33 compute-0 nova_compute[189265]: 2025-09-30 07:42:33.046 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:42:33 compute-0 nova_compute[189265]: 2025-09-30 07:42:33.046 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:42:33 compute-0 nova_compute[189265]: 2025-09-30 07:42:33.047 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:42:33 compute-0 nova_compute[189265]: 2025-09-30 07:42:33.048 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:42:33 compute-0 nova_compute[189265]: 2025-09-30 07:42:33.049 2 DEBUG nova.virt.libvirt.driver [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:42:33 compute-0 podman[223686]: 2025-09-30 07:42:33.49180674 +0000 UTC m=+0.076334852 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.expose-services=, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, vcs-type=git, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal)
Sep 30 07:42:33 compute-0 nova_compute[189265]: 2025-09-30 07:42:33.562 2 INFO nova.compute.manager [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Took 10.29 seconds to spawn the instance on the hypervisor.
Sep 30 07:42:33 compute-0 nova_compute[189265]: 2025-09-30 07:42:33.562 2 DEBUG nova.compute.manager [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:42:33 compute-0 nova_compute[189265]: 2025-09-30 07:42:33.651 2 DEBUG nova.compute.manager [req-58925a04-d62c-4588-be90-67b8b1d110c7 req-68755df5-b3b1-4bf7-99f6-b72c26766e48 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Received event network-vif-plugged-103df688-88b9-4fd1-98fe-1f1b8db21a1d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:42:33 compute-0 nova_compute[189265]: 2025-09-30 07:42:33.651 2 DEBUG oslo_concurrency.lockutils [req-58925a04-d62c-4588-be90-67b8b1d110c7 req-68755df5-b3b1-4bf7-99f6-b72c26766e48 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "6616fa8c-6043-4809-970f-befa571a47bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:42:33 compute-0 nova_compute[189265]: 2025-09-30 07:42:33.652 2 DEBUG oslo_concurrency.lockutils [req-58925a04-d62c-4588-be90-67b8b1d110c7 req-68755df5-b3b1-4bf7-99f6-b72c26766e48 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "6616fa8c-6043-4809-970f-befa571a47bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:42:33 compute-0 nova_compute[189265]: 2025-09-30 07:42:33.653 2 DEBUG oslo_concurrency.lockutils [req-58925a04-d62c-4588-be90-67b8b1d110c7 req-68755df5-b3b1-4bf7-99f6-b72c26766e48 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "6616fa8c-6043-4809-970f-befa571a47bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:42:33 compute-0 nova_compute[189265]: 2025-09-30 07:42:33.653 2 DEBUG nova.compute.manager [req-58925a04-d62c-4588-be90-67b8b1d110c7 req-68755df5-b3b1-4bf7-99f6-b72c26766e48 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] No waiting events found dispatching network-vif-plugged-103df688-88b9-4fd1-98fe-1f1b8db21a1d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:42:33 compute-0 nova_compute[189265]: 2025-09-30 07:42:33.654 2 WARNING nova.compute.manager [req-58925a04-d62c-4588-be90-67b8b1d110c7 req-68755df5-b3b1-4bf7-99f6-b72c26766e48 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Received unexpected event network-vif-plugged-103df688-88b9-4fd1-98fe-1f1b8db21a1d for instance with vm_state building and task_state spawning.
Sep 30 07:42:33 compute-0 nova_compute[189265]: 2025-09-30 07:42:33.736 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:34 compute-0 nova_compute[189265]: 2025-09-30 07:42:34.130 2 INFO nova.compute.manager [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Took 15.56 seconds to build instance.
Sep 30 07:42:34 compute-0 nova_compute[189265]: 2025-09-30 07:42:34.388 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:34 compute-0 nova_compute[189265]: 2025-09-30 07:42:34.651 2 DEBUG oslo_concurrency.lockutils [None req-a4696be3-78d8-4b2d-bdac-6eabf89ec18c 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "6616fa8c-6043-4809-970f-befa571a47bf" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.096s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:42:36 compute-0 podman[223708]: 2025-09-30 07:42:36.511446382 +0000 UTC m=+0.087928475 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:42:36 compute-0 podman[223709]: 2025-09-30 07:42:36.529399727 +0000 UTC m=+0.098212520 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:42:36 compute-0 podman[223710]: 2025-09-30 07:42:36.57618518 +0000 UTC m=+0.139221177 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 07:42:38 compute-0 nova_compute[189265]: 2025-09-30 07:42:38.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:39 compute-0 nova_compute[189265]: 2025-09-30 07:42:39.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:43 compute-0 nova_compute[189265]: 2025-09-30 07:42:43.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:43 compute-0 ovn_controller[91436]: 2025-09-30T07:42:43Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b7:a8:8b 10.100.0.5
Sep 30 07:42:43 compute-0 ovn_controller[91436]: 2025-09-30T07:42:43Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b7:a8:8b 10.100.0.5
Sep 30 07:42:44 compute-0 nova_compute[189265]: 2025-09-30 07:42:44.391 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:46 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 07:42:48 compute-0 nova_compute[189265]: 2025-09-30 07:42:48.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:49 compute-0 nova_compute[189265]: 2025-09-30 07:42:49.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:51 compute-0 nova_compute[189265]: 2025-09-30 07:42:51.290 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:42:51 compute-0 podman[223782]: 2025-09-30 07:42:51.474525761 +0000 UTC m=+0.062324840 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:42:53 compute-0 nova_compute[189265]: 2025-09-30 07:42:53.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:42:53 compute-0 nova_compute[189265]: 2025-09-30 07:42:53.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:42:53 compute-0 nova_compute[189265]: 2025-09-30 07:42:53.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:54 compute-0 nova_compute[189265]: 2025-09-30 07:42:54.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:54 compute-0 nova_compute[189265]: 2025-09-30 07:42:54.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:42:54 compute-0 nova_compute[189265]: 2025-09-30 07:42:54.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:42:58 compute-0 nova_compute[189265]: 2025-09-30 07:42:58.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:59 compute-0 nova_compute[189265]: 2025-09-30 07:42:59.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:42:59 compute-0 podman[199733]: time="2025-09-30T07:42:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:42:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:42:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:42:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:42:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3482 "" "Go-http-client/1.1"
Sep 30 07:43:00 compute-0 podman[223808]: 2025-09-30 07:43:00.484751639 +0000 UTC m=+0.059717065 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2)
Sep 30 07:43:01 compute-0 openstack_network_exporter[201859]: ERROR   07:43:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:43:01 compute-0 openstack_network_exporter[201859]: ERROR   07:43:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:43:01 compute-0 openstack_network_exporter[201859]: ERROR   07:43:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:43:01 compute-0 openstack_network_exporter[201859]: ERROR   07:43:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:43:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:43:01 compute-0 openstack_network_exporter[201859]: ERROR   07:43:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:43:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:43:01 compute-0 ovn_controller[91436]: 2025-09-30T07:43:01Z|00257|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Sep 30 07:43:02 compute-0 nova_compute[189265]: 2025-09-30 07:43:02.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:43:03 compute-0 nova_compute[189265]: 2025-09-30 07:43:03.308 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:03 compute-0 nova_compute[189265]: 2025-09-30 07:43:03.311 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.003s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:03 compute-0 nova_compute[189265]: 2025-09-30 07:43:03.312 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:03 compute-0 nova_compute[189265]: 2025-09-30 07:43:03.312 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:43:03 compute-0 nova_compute[189265]: 2025-09-30 07:43:03.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:04 compute-0 nova_compute[189265]: 2025-09-30 07:43:04.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:04 compute-0 nova_compute[189265]: 2025-09-30 07:43:04.417 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:43:04 compute-0 podman[223830]: 2025-09-30 07:43:04.486494461 +0000 UTC m=+0.065602985 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, config_id=edpm, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public)
Sep 30 07:43:04 compute-0 nova_compute[189265]: 2025-09-30 07:43:04.489 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:43:04 compute-0 nova_compute[189265]: 2025-09-30 07:43:04.490 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:43:04 compute-0 nova_compute[189265]: 2025-09-30 07:43:04.545 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:43:04 compute-0 nova_compute[189265]: 2025-09-30 07:43:04.724 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:43:04 compute-0 nova_compute[189265]: 2025-09-30 07:43:04.726 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:43:04 compute-0 nova_compute[189265]: 2025-09-30 07:43:04.743 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.017s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:43:04 compute-0 nova_compute[189265]: 2025-09-30 07:43:04.744 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5669MB free_disk=73.27468490600586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:43:04 compute-0 nova_compute[189265]: 2025-09-30 07:43:04.745 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:04 compute-0 nova_compute[189265]: 2025-09-30 07:43:04.745 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:05 compute-0 nova_compute[189265]: 2025-09-30 07:43:05.797 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance 6616fa8c-6043-4809-970f-befa571a47bf actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:43:05 compute-0 nova_compute[189265]: 2025-09-30 07:43:05.798 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:43:05 compute-0 nova_compute[189265]: 2025-09-30 07:43:05.798 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:43:04 up  1:40,  0 user,  load average: 0.30, 0.33, 0.30\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_eb449ca8f36d45d88d1ef08bcb192ca6': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:43:05 compute-0 nova_compute[189265]: 2025-09-30 07:43:05.839 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:43:06 compute-0 nova_compute[189265]: 2025-09-30 07:43:06.346 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:43:06 compute-0 nova_compute[189265]: 2025-09-30 07:43:06.858 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:43:06 compute-0 nova_compute[189265]: 2025-09-30 07:43:06.859 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.114s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:07 compute-0 podman[223859]: 2025-09-30 07:43:07.475569845 +0000 UTC m=+0.058052547 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 07:43:07 compute-0 podman[223860]: 2025-09-30 07:43:07.49628784 +0000 UTC m=+0.079108072 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Sep 30 07:43:07 compute-0 podman[223858]: 2025-09-30 07:43:07.530832971 +0000 UTC m=+0.111386898 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:43:08 compute-0 nova_compute[189265]: 2025-09-30 07:43:08.859 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:43:08 compute-0 nova_compute[189265]: 2025-09-30 07:43:08.859 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:43:08 compute-0 nova_compute[189265]: 2025-09-30 07:43:08.906 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:09 compute-0 nova_compute[189265]: 2025-09-30 07:43:09.037 2 DEBUG nova.virt.libvirt.driver [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Creating tmpfile /var/lib/nova/instances/tmppuew7nxn to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 07:43:09 compute-0 nova_compute[189265]: 2025-09-30 07:43:09.038 2 WARNING neutronclient.v2_0.client [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:09 compute-0 nova_compute[189265]: 2025-09-30 07:43:09.141 2 DEBUG nova.compute.manager [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppuew7nxn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 07:43:09 compute-0 nova_compute[189265]: 2025-09-30 07:43:09.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:09 compute-0 nova_compute[189265]: 2025-09-30 07:43:09.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:43:11 compute-0 nova_compute[189265]: 2025-09-30 07:43:11.178 2 WARNING neutronclient.v2_0.client [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:13 compute-0 nova_compute[189265]: 2025-09-30 07:43:13.908 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:14 compute-0 nova_compute[189265]: 2025-09-30 07:43:14.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:16 compute-0 nova_compute[189265]: 2025-09-30 07:43:16.840 2 DEBUG nova.compute.manager [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppuew7nxn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6e93e66e-2937-4004-b34f-3fe033cb2935',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 07:43:17 compute-0 nova_compute[189265]: 2025-09-30 07:43:17.861 2 DEBUG oslo_concurrency.lockutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-6e93e66e-2937-4004-b34f-3fe033cb2935" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:43:17 compute-0 nova_compute[189265]: 2025-09-30 07:43:17.862 2 DEBUG oslo_concurrency.lockutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-6e93e66e-2937-4004-b34f-3fe033cb2935" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:43:17 compute-0 nova_compute[189265]: 2025-09-30 07:43:17.862 2 DEBUG nova.network.neutron [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:43:18 compute-0 nova_compute[189265]: 2025-09-30 07:43:18.368 2 WARNING neutronclient.v2_0.client [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:18 compute-0 nova_compute[189265]: 2025-09-30 07:43:18.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:19 compute-0 nova_compute[189265]: 2025-09-30 07:43:19.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:19 compute-0 nova_compute[189265]: 2025-09-30 07:43:19.412 2 WARNING neutronclient.v2_0.client [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:20 compute-0 nova_compute[189265]: 2025-09-30 07:43:20.122 2 DEBUG nova.network.neutron [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Updating instance_info_cache with network_info: [{"id": "4d31ed33-b19b-46b2-80ed-c3276286d105", "address": "fa:16:3e:95:74:6c", "network": {"id": "c37ccab9-b2b3-4600-9cd6-fc38d618b79f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1008995657-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75afc4c4c3cd416898ef46cd7b7e99de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d31ed33-b1", "ovs_interfaceid": "4d31ed33-b19b-46b2-80ed-c3276286d105", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:43:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:20.585 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:20.586 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:20.586 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:20 compute-0 nova_compute[189265]: 2025-09-30 07:43:20.632 2 DEBUG oslo_concurrency.lockutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-6e93e66e-2937-4004-b34f-3fe033cb2935" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:43:20 compute-0 nova_compute[189265]: 2025-09-30 07:43:20.661 2 DEBUG nova.virt.libvirt.driver [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppuew7nxn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6e93e66e-2937-4004-b34f-3fe033cb2935',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 07:43:20 compute-0 nova_compute[189265]: 2025-09-30 07:43:20.662 2 DEBUG nova.virt.libvirt.driver [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Creating instance directory: /var/lib/nova/instances/6e93e66e-2937-4004-b34f-3fe033cb2935 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 07:43:20 compute-0 nova_compute[189265]: 2025-09-30 07:43:20.663 2 DEBUG nova.virt.libvirt.driver [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Creating disk.info with the contents: {'/var/lib/nova/instances/6e93e66e-2937-4004-b34f-3fe033cb2935/disk': 'qcow2', '/var/lib/nova/instances/6e93e66e-2937-4004-b34f-3fe033cb2935/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 07:43:20 compute-0 nova_compute[189265]: 2025-09-30 07:43:20.664 2 DEBUG nova.virt.libvirt.driver [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 07:43:20 compute-0 nova_compute[189265]: 2025-09-30 07:43:20.665 2 DEBUG nova.objects.instance [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid 6e93e66e-2937-4004-b34f-3fe033cb2935 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.174 2 DEBUG oslo_utils.imageutils.format_inspector [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.180 2 DEBUG oslo_utils.imageutils.format_inspector [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.182 2 DEBUG oslo_concurrency.processutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.268 2 DEBUG oslo_concurrency.processutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.270 2 DEBUG oslo_concurrency.lockutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.271 2 DEBUG oslo_concurrency.lockutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.272 2 DEBUG oslo_utils.imageutils.format_inspector [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.278 2 DEBUG oslo_utils.imageutils.format_inspector [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.279 2 DEBUG oslo_concurrency.processutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.354 2 DEBUG oslo_concurrency.processutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.356 2 DEBUG oslo_concurrency.processutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/6e93e66e-2937-4004-b34f-3fe033cb2935/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.407 2 DEBUG oslo_concurrency.processutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/6e93e66e-2937-4004-b34f-3fe033cb2935/disk 1073741824" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.409 2 DEBUG oslo_concurrency.lockutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.138s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.409 2 DEBUG oslo_concurrency.processutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.490 2 DEBUG oslo_concurrency.processutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.492 2 DEBUG nova.virt.disk.api [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Checking if we can resize image /var/lib/nova/instances/6e93e66e-2937-4004-b34f-3fe033cb2935/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.493 2 DEBUG oslo_concurrency.processutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e93e66e-2937-4004-b34f-3fe033cb2935/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.543 2 DEBUG oslo_concurrency.processutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6e93e66e-2937-4004-b34f-3fe033cb2935/disk --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.545 2 DEBUG nova.virt.disk.api [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Cannot resize image /var/lib/nova/instances/6e93e66e-2937-4004-b34f-3fe033cb2935/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:43:21 compute-0 nova_compute[189265]: 2025-09-30 07:43:21.545 2 DEBUG nova.objects.instance [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'migration_context' on Instance uuid 6e93e66e-2937-4004-b34f-3fe033cb2935 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.053 2 DEBUG nova.objects.base [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<6e93e66e-2937-4004-b34f-3fe033cb2935> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.054 2 DEBUG oslo_concurrency.processutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/6e93e66e-2937-4004-b34f-3fe033cb2935/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.088 2 DEBUG oslo_concurrency.processutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/6e93e66e-2937-4004-b34f-3fe033cb2935/disk.config 497664" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.089 2 DEBUG nova.virt.libvirt.driver [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.090 2 DEBUG nova.virt.libvirt.vif [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T07:41:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-2113466578',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-2113466578',id=26,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:42:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='eb449ca8f36d45d88d1ef08bcb192ca6',ramdisk_id='',reservation_id='r-5v2dyfq0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_
model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-264644006',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-264644006-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:42:10Z,user_data=None,user_id='5c02a0a41ab14f6a92e1e6e2798736ae',uuid=6e93e66e-2937-4004-b34f-3fe033cb2935,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4d31ed33-b19b-46b2-80ed-c3276286d105", "address": "fa:16:3e:95:74:6c", "network": {"id": "c37ccab9-b2b3-4600-9cd6-fc38d618b79f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1008995657-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75afc4c4c3cd416898ef46cd7b7e99de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4d31ed33-b1", "ovs_interfaceid": "4d31ed33-b19b-46b2-80ed-c3276286d105", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.090 2 DEBUG nova.network.os_vif_util [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "4d31ed33-b19b-46b2-80ed-c3276286d105", "address": "fa:16:3e:95:74:6c", "network": {"id": "c37ccab9-b2b3-4600-9cd6-fc38d618b79f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1008995657-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75afc4c4c3cd416898ef46cd7b7e99de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap4d31ed33-b1", "ovs_interfaceid": "4d31ed33-b19b-46b2-80ed-c3276286d105", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.090 2 DEBUG nova.network.os_vif_util [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:74:6c,bridge_name='br-int',has_traffic_filtering=True,id=4d31ed33-b19b-46b2-80ed-c3276286d105,network=Network(c37ccab9-b2b3-4600-9cd6-fc38d618b79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d31ed33-b1') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.091 2 DEBUG os_vif [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:74:6c,bridge_name='br-int',has_traffic_filtering=True,id=4d31ed33-b19b-46b2-80ed-c3276286d105,network=Network(c37ccab9-b2b3-4600-9cd6-fc38d618b79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d31ed33-b1') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.092 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.092 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.093 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '6d9a99ad-edcc-5b24-bdb8-d6ebfe7999c5', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.097 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.098 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4d31ed33-b1, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.098 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap4d31ed33-b1, col_values=(('qos', UUID('ef223baf-7233-4d96-ac01-72eab59443ca')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.098 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap4d31ed33-b1, col_values=(('external_ids', {'iface-id': '4d31ed33-b19b-46b2-80ed-c3276286d105', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:74:6c', 'vm-uuid': '6e93e66e-2937-4004-b34f-3fe033cb2935'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:22 compute-0 NetworkManager[51813]: <info>  [1759218202.1007] manager: (tap4d31ed33-b1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.106 2 INFO os_vif [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:74:6c,bridge_name='br-int',has_traffic_filtering=True,id=4d31ed33-b19b-46b2-80ed-c3276286d105,network=Network(c37ccab9-b2b3-4600-9cd6-fc38d618b79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d31ed33-b1')
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.106 2 DEBUG nova.virt.libvirt.driver [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.107 2 DEBUG nova.compute.manager [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppuew7nxn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6e93e66e-2937-4004-b34f-3fe033cb2935',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.107 2 WARNING neutronclient.v2_0.client [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:22 compute-0 podman[223941]: 2025-09-30 07:43:22.498924793 +0000 UTC m=+0.077359021 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.788 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.789 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.790 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.791 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.792 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:22 compute-0 nova_compute[189265]: 2025-09-30 07:43:22.792 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:23 compute-0 nova_compute[189265]: 2025-09-30 07:43:23.068 2 WARNING neutronclient.v2_0.client [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:23 compute-0 nova_compute[189265]: 2025-09-30 07:43:23.810 2 DEBUG nova.virt.libvirt.imagecache [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:314
Sep 30 07:43:23 compute-0 nova_compute[189265]: 2025-09-30 07:43:23.810 2 DEBUG nova.virt.libvirt.imagecache [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Image id 0c6b92f5-9861-49e4-862d-3ffd84520dfa yields fingerprint 649c128805005f3dfb5a93843c58a367cdfe939d _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:319
Sep 30 07:43:23 compute-0 nova_compute[189265]: 2025-09-30 07:43:23.811 2 INFO nova.virt.libvirt.imagecache [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] image 0c6b92f5-9861-49e4-862d-3ffd84520dfa at (/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d): checking
Sep 30 07:43:23 compute-0 nova_compute[189265]: 2025-09-30 07:43:23.811 2 DEBUG nova.virt.libvirt.imagecache [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] image 0c6b92f5-9861-49e4-862d-3ffd84520dfa at (/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d): image is in use _mark_in_use /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:279
Sep 30 07:43:23 compute-0 nova_compute[189265]: 2025-09-30 07:43:23.812 2 DEBUG nova.virt.libvirt.imagecache [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:319
Sep 30 07:43:23 compute-0 nova_compute[189265]: 2025-09-30 07:43:23.813 2 DEBUG nova.virt.libvirt.imagecache [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] 6616fa8c-6043-4809-970f-befa571a47bf is a valid instance name _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:126
Sep 30 07:43:23 compute-0 nova_compute[189265]: 2025-09-30 07:43:23.813 2 DEBUG nova.virt.libvirt.imagecache [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] 6616fa8c-6043-4809-970f-befa571a47bf has a disk file _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:129
Sep 30 07:43:23 compute-0 nova_compute[189265]: 2025-09-30 07:43:23.813 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:43:23 compute-0 nova_compute[189265]: 2025-09-30 07:43:23.877 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:43:23 compute-0 nova_compute[189265]: 2025-09-30 07:43:23.878 2 DEBUG nova.virt.libvirt.imagecache [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance 6616fa8c-6043-4809-970f-befa571a47bf is backed by 649c128805005f3dfb5a93843c58a367cdfe939d _list_backing_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:141
Sep 30 07:43:23 compute-0 nova_compute[189265]: 2025-09-30 07:43:23.878 2 INFO nova.virt.libvirt.imagecache [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Active base files: /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d
Sep 30 07:43:23 compute-0 nova_compute[189265]: 2025-09-30 07:43:23.879 2 DEBUG nova.virt.libvirt.imagecache [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:350
Sep 30 07:43:23 compute-0 nova_compute[189265]: 2025-09-30 07:43:23.879 2 DEBUG nova.virt.libvirt.imagecache [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:299
Sep 30 07:43:23 compute-0 nova_compute[189265]: 2025-09-30 07:43:23.879 2 DEBUG nova.virt.libvirt.imagecache [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.12/site-packages/nova/virt/libvirt/imagecache.py:284
Sep 30 07:43:24 compute-0 nova_compute[189265]: 2025-09-30 07:43:24.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:24 compute-0 nova_compute[189265]: 2025-09-30 07:43:24.914 2 DEBUG nova.network.neutron [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Port 4d31ed33-b19b-46b2-80ed-c3276286d105 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 07:43:24 compute-0 nova_compute[189265]: 2025-09-30 07:43:24.932 2 DEBUG nova.compute.manager [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmppuew7nxn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='6e93e66e-2937-4004-b34f-3fe033cb2935',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 07:43:27 compute-0 nova_compute[189265]: 2025-09-30 07:43:27.099 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:27 compute-0 systemd[1]: Starting libvirt proxy daemon...
Sep 30 07:43:27 compute-0 systemd[1]: Started libvirt proxy daemon.
Sep 30 07:43:28 compute-0 kernel: tap4d31ed33-b1: entered promiscuous mode
Sep 30 07:43:28 compute-0 nova_compute[189265]: 2025-09-30 07:43:28.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:28 compute-0 ovn_controller[91436]: 2025-09-30T07:43:28Z|00258|binding|INFO|Claiming lport 4d31ed33-b19b-46b2-80ed-c3276286d105 for this additional chassis.
Sep 30 07:43:28 compute-0 ovn_controller[91436]: 2025-09-30T07:43:28Z|00259|binding|INFO|4d31ed33-b19b-46b2-80ed-c3276286d105: Claiming fa:16:3e:95:74:6c 10.100.0.3
Sep 30 07:43:28 compute-0 NetworkManager[51813]: <info>  [1759218208.1858] manager: (tap4d31ed33-b1): new Tun device (/org/freedesktop/NetworkManager/Devices/92)
Sep 30 07:43:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:28.188 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:74:6c 10.100.0.3'], port_security=['fa:16:3e:95:74:6c 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6e93e66e-2937-4004-b34f-3fe033cb2935', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb449ca8f36d45d88d1ef08bcb192ca6', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'd8d8aea8-dfd7-4028-b831-7a1c1bc3f21e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a63f614-d134-4922-9cae-18c0918b6eb4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=4d31ed33-b19b-46b2-80ed-c3276286d105) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:43:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:28.193 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 4d31ed33-b19b-46b2-80ed-c3276286d105 in datapath c37ccab9-b2b3-4600-9cd6-fc38d618b79f unbound from our chassis
Sep 30 07:43:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:28.194 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c37ccab9-b2b3-4600-9cd6-fc38d618b79f
Sep 30 07:43:28 compute-0 ovn_controller[91436]: 2025-09-30T07:43:28Z|00260|binding|INFO|Setting lport 4d31ed33-b19b-46b2-80ed-c3276286d105 ovn-installed in OVS
Sep 30 07:43:28 compute-0 nova_compute[189265]: 2025-09-30 07:43:28.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:28 compute-0 systemd-udevd[223999]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:43:28 compute-0 nova_compute[189265]: 2025-09-30 07:43:28.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:28.218 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[70048517-9dc2-423b-9e0e-6d0f39e28942]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:28 compute-0 NetworkManager[51813]: <info>  [1759218208.2308] device (tap4d31ed33-b1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:43:28 compute-0 NetworkManager[51813]: <info>  [1759218208.2317] device (tap4d31ed33-b1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:43:28 compute-0 systemd-machined[149233]: New machine qemu-22-instance-0000001a.
Sep 30 07:43:28 compute-0 systemd[1]: Started Virtual Machine qemu-22-instance-0000001a.
Sep 30 07:43:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:28.261 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[83e22bc2-aea9-487e-8240-76b76d9ed866]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:28.265 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[ad1de08c-68d8-4d5c-bfa0-2ace04ad74be]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:28.309 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[73abe4f3-8d45-42b7-823d-c789563b848e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:28.336 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[1a3ac310-08ec-4c32-ae85-9a7f23e51c83]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc37ccab9-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:87:8a:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600923, 'reachable_time': 28999, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224014, 'error': None, 'target': 'ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:28.360 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[6841a189-739e-46a2-a48d-303c83aa4de6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc37ccab9-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600935, 'tstamp': 600935}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224015, 'error': None, 'target': 'ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc37ccab9-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600938, 'tstamp': 600938}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224015, 'error': None, 'target': 'ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:28.361 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc37ccab9-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:28 compute-0 nova_compute[189265]: 2025-09-30 07:43:28.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:28 compute-0 nova_compute[189265]: 2025-09-30 07:43:28.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:28.364 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc37ccab9-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:28.364 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:43:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:28.364 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc37ccab9-b0, col_values=(('external_ids', {'iface-id': '62cc21a6-25dd-4a68-bbc0-05c4bab51f8a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:28.364 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:43:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:28.365 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f0c28dde-79a9-4a14-aad0-3c5e78a90ab3]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-c37ccab9-b2b3-4600-9cd6-fc38d618b79f\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/c37ccab9-b2b3-4600-9cd6-fc38d618b79f.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID c37ccab9-b2b3-4600-9cd6-fc38d618b79f\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:29 compute-0 nova_compute[189265]: 2025-09-30 07:43:29.411 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:29 compute-0 podman[199733]: time="2025-09-30T07:43:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:43:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:43:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:43:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:43:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3479 "" "Go-http-client/1.1"
Sep 30 07:43:31 compute-0 openstack_network_exporter[201859]: ERROR   07:43:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:43:31 compute-0 openstack_network_exporter[201859]: ERROR   07:43:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:43:31 compute-0 openstack_network_exporter[201859]: ERROR   07:43:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:43:31 compute-0 openstack_network_exporter[201859]: ERROR   07:43:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:43:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:43:31 compute-0 openstack_network_exporter[201859]: ERROR   07:43:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:43:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:43:31 compute-0 podman[224038]: 2025-09-30 07:43:31.529874917 +0000 UTC m=+0.099598930 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 07:43:32 compute-0 nova_compute[189265]: 2025-09-30 07:43:32.101 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:32 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:32.119 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:43:32 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:32.120 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:43:32 compute-0 nova_compute[189265]: 2025-09-30 07:43:32.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:32 compute-0 ovn_controller[91436]: 2025-09-30T07:43:32Z|00261|binding|INFO|Claiming lport 4d31ed33-b19b-46b2-80ed-c3276286d105 for this chassis.
Sep 30 07:43:32 compute-0 ovn_controller[91436]: 2025-09-30T07:43:32Z|00262|binding|INFO|4d31ed33-b19b-46b2-80ed-c3276286d105: Claiming fa:16:3e:95:74:6c 10.100.0.3
Sep 30 07:43:32 compute-0 ovn_controller[91436]: 2025-09-30T07:43:32Z|00263|binding|INFO|Setting lport 4d31ed33-b19b-46b2-80ed-c3276286d105 up in Southbound
Sep 30 07:43:33 compute-0 nova_compute[189265]: 2025-09-30 07:43:33.805 2 INFO nova.compute.manager [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Post operation of migration started
Sep 30 07:43:33 compute-0 nova_compute[189265]: 2025-09-30 07:43:33.805 2 WARNING neutronclient.v2_0.client [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:34 compute-0 nova_compute[189265]: 2025-09-30 07:43:34.052 2 WARNING neutronclient.v2_0.client [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:34 compute-0 nova_compute[189265]: 2025-09-30 07:43:34.053 2 WARNING neutronclient.v2_0.client [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:34 compute-0 nova_compute[189265]: 2025-09-30 07:43:34.174 2 DEBUG oslo_concurrency.lockutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-6e93e66e-2937-4004-b34f-3fe033cb2935" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:43:34 compute-0 nova_compute[189265]: 2025-09-30 07:43:34.174 2 DEBUG oslo_concurrency.lockutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-6e93e66e-2937-4004-b34f-3fe033cb2935" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:43:34 compute-0 nova_compute[189265]: 2025-09-30 07:43:34.175 2 DEBUG nova.network.neutron [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:43:34 compute-0 nova_compute[189265]: 2025-09-30 07:43:34.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:34 compute-0 nova_compute[189265]: 2025-09-30 07:43:34.683 2 WARNING neutronclient.v2_0.client [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:35 compute-0 nova_compute[189265]: 2025-09-30 07:43:35.264 2 WARNING neutronclient.v2_0.client [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:35 compute-0 podman[224058]: 2025-09-30 07:43:35.461283911 +0000 UTC m=+0.051595752 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1755695350, vendor=Red Hat, Inc., version=9.6)
Sep 30 07:43:35 compute-0 nova_compute[189265]: 2025-09-30 07:43:35.505 2 DEBUG nova.network.neutron [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Updating instance_info_cache with network_info: [{"id": "4d31ed33-b19b-46b2-80ed-c3276286d105", "address": "fa:16:3e:95:74:6c", "network": {"id": "c37ccab9-b2b3-4600-9cd6-fc38d618b79f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1008995657-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75afc4c4c3cd416898ef46cd7b7e99de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d31ed33-b1", "ovs_interfaceid": "4d31ed33-b19b-46b2-80ed-c3276286d105", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:43:36 compute-0 nova_compute[189265]: 2025-09-30 07:43:36.013 2 DEBUG oslo_concurrency.lockutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-6e93e66e-2937-4004-b34f-3fe033cb2935" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:43:36 compute-0 nova_compute[189265]: 2025-09-30 07:43:36.534 2 DEBUG oslo_concurrency.lockutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:36 compute-0 nova_compute[189265]: 2025-09-30 07:43:36.535 2 DEBUG oslo_concurrency.lockutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:36 compute-0 nova_compute[189265]: 2025-09-30 07:43:36.535 2 DEBUG oslo_concurrency.lockutils [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:36 compute-0 nova_compute[189265]: 2025-09-30 07:43:36.542 2 INFO nova.virt.libvirt.driver [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 07:43:36 compute-0 virtqemud[189090]: Domain id=22 name='instance-0000001a' uuid=6e93e66e-2937-4004-b34f-3fe033cb2935 is tainted: custom-monitor
Sep 30 07:43:37 compute-0 nova_compute[189265]: 2025-09-30 07:43:37.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:37 compute-0 nova_compute[189265]: 2025-09-30 07:43:37.551 2 INFO nova.virt.libvirt.driver [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 07:43:38 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:38.122 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:38 compute-0 podman[224082]: 2025-09-30 07:43:38.506774776 +0000 UTC m=+0.073231843 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 07:43:38 compute-0 podman[224081]: 2025-09-30 07:43:38.521612332 +0000 UTC m=+0.090474548 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 07:43:38 compute-0 podman[224083]: 2025-09-30 07:43:38.529254101 +0000 UTC m=+0.100299910 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 07:43:38 compute-0 nova_compute[189265]: 2025-09-30 07:43:38.557 2 INFO nova.virt.libvirt.driver [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 07:43:38 compute-0 nova_compute[189265]: 2025-09-30 07:43:38.562 2 DEBUG nova.compute.manager [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:43:39 compute-0 nova_compute[189265]: 2025-09-30 07:43:39.074 2 DEBUG nova.objects.instance [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 07:43:39 compute-0 nova_compute[189265]: 2025-09-30 07:43:39.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:40 compute-0 nova_compute[189265]: 2025-09-30 07:43:40.098 2 WARNING neutronclient.v2_0.client [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:40 compute-0 nova_compute[189265]: 2025-09-30 07:43:40.290 2 WARNING neutronclient.v2_0.client [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:40 compute-0 nova_compute[189265]: 2025-09-30 07:43:40.291 2 WARNING neutronclient.v2_0.client [None req-57530c1d-c972-4388-af02-6db17d5e5bed e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:42 compute-0 nova_compute[189265]: 2025-09-30 07:43:42.106 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:44 compute-0 nova_compute[189265]: 2025-09-30 07:43:44.109 2 DEBUG oslo_concurrency.lockutils [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Acquiring lock "6616fa8c-6043-4809-970f-befa571a47bf" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:44 compute-0 nova_compute[189265]: 2025-09-30 07:43:44.109 2 DEBUG oslo_concurrency.lockutils [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "6616fa8c-6043-4809-970f-befa571a47bf" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:44 compute-0 nova_compute[189265]: 2025-09-30 07:43:44.110 2 DEBUG oslo_concurrency.lockutils [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Acquiring lock "6616fa8c-6043-4809-970f-befa571a47bf-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:44 compute-0 nova_compute[189265]: 2025-09-30 07:43:44.110 2 DEBUG oslo_concurrency.lockutils [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "6616fa8c-6043-4809-970f-befa571a47bf-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:44 compute-0 nova_compute[189265]: 2025-09-30 07:43:44.111 2 DEBUG oslo_concurrency.lockutils [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "6616fa8c-6043-4809-970f-befa571a47bf-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:44 compute-0 nova_compute[189265]: 2025-09-30 07:43:44.127 2 INFO nova.compute.manager [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Terminating instance
Sep 30 07:43:44 compute-0 nova_compute[189265]: 2025-09-30 07:43:44.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:44 compute-0 nova_compute[189265]: 2025-09-30 07:43:44.646 2 DEBUG nova.compute.manager [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:43:44 compute-0 kernel: tap103df688-88 (unregistering): left promiscuous mode
Sep 30 07:43:44 compute-0 NetworkManager[51813]: <info>  [1759218224.6714] device (tap103df688-88): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:43:44 compute-0 ovn_controller[91436]: 2025-09-30T07:43:44Z|00264|binding|INFO|Releasing lport 103df688-88b9-4fd1-98fe-1f1b8db21a1d from this chassis (sb_readonly=0)
Sep 30 07:43:44 compute-0 ovn_controller[91436]: 2025-09-30T07:43:44Z|00265|binding|INFO|Setting lport 103df688-88b9-4fd1-98fe-1f1b8db21a1d down in Southbound
Sep 30 07:43:44 compute-0 ovn_controller[91436]: 2025-09-30T07:43:44Z|00266|binding|INFO|Removing iface tap103df688-88 ovn-installed in OVS
Sep 30 07:43:44 compute-0 nova_compute[189265]: 2025-09-30 07:43:44.687 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:44 compute-0 nova_compute[189265]: 2025-09-30 07:43:44.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:44.711 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:a8:8b 10.100.0.5'], port_security=['fa:16:3e:b7:a8:8b 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '6616fa8c-6043-4809-970f-befa571a47bf', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb449ca8f36d45d88d1ef08bcb192ca6', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'd8d8aea8-dfd7-4028-b831-7a1c1bc3f21e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a63f614-d134-4922-9cae-18c0918b6eb4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=103df688-88b9-4fd1-98fe-1f1b8db21a1d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:43:44 compute-0 nova_compute[189265]: 2025-09-30 07:43:44.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:44.714 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 103df688-88b9-4fd1-98fe-1f1b8db21a1d in datapath c37ccab9-b2b3-4600-9cd6-fc38d618b79f unbound from our chassis
Sep 30 07:43:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:44.717 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c37ccab9-b2b3-4600-9cd6-fc38d618b79f
Sep 30 07:43:44 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001b.scope: Deactivated successfully.
Sep 30 07:43:44 compute-0 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000001b.scope: Consumed 14.646s CPU time.
Sep 30 07:43:44 compute-0 systemd-machined[149233]: Machine qemu-21-instance-0000001b terminated.
Sep 30 07:43:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:44.744 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[519d5e52-4446-41ed-9864-4ea8517d5262]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:44.782 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[963dafcb-5190-4bf3-bb0d-2c5b409b3379]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:44.785 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[4e88e436-6278-4bcd-a82c-36d8af8358e4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:44.821 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[2e2e9acf-d920-4b8c-89ff-97a26c32a865]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:44.841 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0b699723-cfc8-4e87-8057-663ef143729f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc37ccab9-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:87:8a:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 64], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600923, 'reachable_time': 28999, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224157, 'error': None, 'target': 'ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:44.864 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7411a3a8-8b7a-4068-8aeb-e35addd24908]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapc37ccab9-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600935, 'tstamp': 600935}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224158, 'error': None, 'target': 'ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapc37ccab9-b1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 600938, 'tstamp': 600938}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224158, 'error': None, 'target': 'ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:44.865 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc37ccab9-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:44 compute-0 nova_compute[189265]: 2025-09-30 07:43:44.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:44 compute-0 nova_compute[189265]: 2025-09-30 07:43:44.874 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:44.875 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc37ccab9-b0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:44.875 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:43:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:44.876 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc37ccab9-b0, col_values=(('external_ids', {'iface-id': '62cc21a6-25dd-4a68-bbc0-05c4bab51f8a'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:44.877 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:43:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:44.879 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[e206d7df-d0bd-4b6e-b05f-bc52261f1972]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-c37ccab9-b2b3-4600-9cd6-fc38d618b79f\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/c37ccab9-b2b3-4600-9cd6-fc38d618b79f.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID c37ccab9-b2b3-4600-9cd6-fc38d618b79f\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:44 compute-0 nova_compute[189265]: 2025-09-30 07:43:44.935 2 INFO nova.virt.libvirt.driver [-] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Instance destroyed successfully.
Sep 30 07:43:44 compute-0 nova_compute[189265]: 2025-09-30 07:43:44.935 2 DEBUG nova.objects.instance [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lazy-loading 'resources' on Instance uuid 6616fa8c-6043-4809-970f-befa571a47bf obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.166 2 DEBUG nova.compute.manager [req-6f114735-bb6d-4fcd-87c6-80754f8afe25 req-c0ac6dc0-9eec-42f5-804f-16ef19ff7eef 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Received event network-vif-unplugged-103df688-88b9-4fd1-98fe-1f1b8db21a1d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.166 2 DEBUG oslo_concurrency.lockutils [req-6f114735-bb6d-4fcd-87c6-80754f8afe25 req-c0ac6dc0-9eec-42f5-804f-16ef19ff7eef 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "6616fa8c-6043-4809-970f-befa571a47bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.167 2 DEBUG oslo_concurrency.lockutils [req-6f114735-bb6d-4fcd-87c6-80754f8afe25 req-c0ac6dc0-9eec-42f5-804f-16ef19ff7eef 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "6616fa8c-6043-4809-970f-befa571a47bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.167 2 DEBUG oslo_concurrency.lockutils [req-6f114735-bb6d-4fcd-87c6-80754f8afe25 req-c0ac6dc0-9eec-42f5-804f-16ef19ff7eef 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "6616fa8c-6043-4809-970f-befa571a47bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.168 2 DEBUG nova.compute.manager [req-6f114735-bb6d-4fcd-87c6-80754f8afe25 req-c0ac6dc0-9eec-42f5-804f-16ef19ff7eef 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] No waiting events found dispatching network-vif-unplugged-103df688-88b9-4fd1-98fe-1f1b8db21a1d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.168 2 DEBUG nova.compute.manager [req-6f114735-bb6d-4fcd-87c6-80754f8afe25 req-c0ac6dc0-9eec-42f5-804f-16ef19ff7eef 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Received event network-vif-unplugged-103df688-88b9-4fd1-98fe-1f1b8db21a1d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.443 2 DEBUG nova.virt.libvirt.vif [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:42:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-345872352',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-345872352',id=27,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:42:33Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='eb449ca8f36d45d88d1ef08bcb192ca6',ramdisk_id='',reservation_id='r-s4id5kua',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',imag
e_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-264644006',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-264644006-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:42:33Z,user_data=None,user_id='5c02a0a41ab14f6a92e1e6e2798736ae',uuid=6616fa8c-6043-4809-970f-befa571a47bf,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "address": "fa:16:3e:b7:a8:8b", "network": {"id": "c37ccab9-b2b3-4600-9cd6-fc38d618b79f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1008995657-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75afc4c4c3cd416898ef46cd7b7e99de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap103df688-88", "ovs_interfaceid": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.443 2 DEBUG nova.network.os_vif_util [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Converting VIF {"id": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "address": "fa:16:3e:b7:a8:8b", "network": {"id": "c37ccab9-b2b3-4600-9cd6-fc38d618b79f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1008995657-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75afc4c4c3cd416898ef46cd7b7e99de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap103df688-88", "ovs_interfaceid": "103df688-88b9-4fd1-98fe-1f1b8db21a1d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.444 2 DEBUG nova.network.os_vif_util [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:a8:8b,bridge_name='br-int',has_traffic_filtering=True,id=103df688-88b9-4fd1-98fe-1f1b8db21a1d,network=Network(c37ccab9-b2b3-4600-9cd6-fc38d618b79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap103df688-88') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.445 2 DEBUG os_vif [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:a8:8b,bridge_name='br-int',has_traffic_filtering=True,id=103df688-88b9-4fd1-98fe-1f1b8db21a1d,network=Network(c37ccab9-b2b3-4600-9cd6-fc38d618b79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap103df688-88') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.447 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap103df688-88, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.454 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=0ee2f6f4-7ad2-4f81-b13f-c63c35d618da) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.459 2 INFO os_vif [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:a8:8b,bridge_name='br-int',has_traffic_filtering=True,id=103df688-88b9-4fd1-98fe-1f1b8db21a1d,network=Network(c37ccab9-b2b3-4600-9cd6-fc38d618b79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap103df688-88')
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.460 2 INFO nova.virt.libvirt.driver [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Deleting instance files /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf_del
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.461 2 INFO nova.virt.libvirt.driver [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Deletion of /var/lib/nova/instances/6616fa8c-6043-4809-970f-befa571a47bf_del complete
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.979 2 INFO nova.compute.manager [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Took 1.33 seconds to destroy the instance on the hypervisor.
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.979 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.980 2 DEBUG nova.compute.manager [-] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.980 2 DEBUG nova.network.neutron [-] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:43:45 compute-0 nova_compute[189265]: 2025-09-30 07:43:45.981 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:47 compute-0 nova_compute[189265]: 2025-09-30 07:43:47.055 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:47 compute-0 nova_compute[189265]: 2025-09-30 07:43:47.255 2 DEBUG nova.compute.manager [req-5a25c0ac-d662-44ef-89b6-f3f16ac54fd5 req-ffcaf371-283d-493e-8ba9-e8dffa3c3f73 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Received event network-vif-unplugged-103df688-88b9-4fd1-98fe-1f1b8db21a1d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:43:47 compute-0 nova_compute[189265]: 2025-09-30 07:43:47.256 2 DEBUG oslo_concurrency.lockutils [req-5a25c0ac-d662-44ef-89b6-f3f16ac54fd5 req-ffcaf371-283d-493e-8ba9-e8dffa3c3f73 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "6616fa8c-6043-4809-970f-befa571a47bf-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:47 compute-0 nova_compute[189265]: 2025-09-30 07:43:47.256 2 DEBUG oslo_concurrency.lockutils [req-5a25c0ac-d662-44ef-89b6-f3f16ac54fd5 req-ffcaf371-283d-493e-8ba9-e8dffa3c3f73 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "6616fa8c-6043-4809-970f-befa571a47bf-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:47 compute-0 nova_compute[189265]: 2025-09-30 07:43:47.257 2 DEBUG oslo_concurrency.lockutils [req-5a25c0ac-d662-44ef-89b6-f3f16ac54fd5 req-ffcaf371-283d-493e-8ba9-e8dffa3c3f73 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "6616fa8c-6043-4809-970f-befa571a47bf-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:47 compute-0 nova_compute[189265]: 2025-09-30 07:43:47.257 2 DEBUG nova.compute.manager [req-5a25c0ac-d662-44ef-89b6-f3f16ac54fd5 req-ffcaf371-283d-493e-8ba9-e8dffa3c3f73 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] No waiting events found dispatching network-vif-unplugged-103df688-88b9-4fd1-98fe-1f1b8db21a1d pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:43:47 compute-0 nova_compute[189265]: 2025-09-30 07:43:47.257 2 DEBUG nova.compute.manager [req-5a25c0ac-d662-44ef-89b6-f3f16ac54fd5 req-ffcaf371-283d-493e-8ba9-e8dffa3c3f73 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Received event network-vif-unplugged-103df688-88b9-4fd1-98fe-1f1b8db21a1d for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:43:48 compute-0 nova_compute[189265]: 2025-09-30 07:43:48.177 2 DEBUG nova.network.neutron [-] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:43:48 compute-0 nova_compute[189265]: 2025-09-30 07:43:48.685 2 INFO nova.compute.manager [-] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Took 2.70 seconds to deallocate network for instance.
Sep 30 07:43:49 compute-0 nova_compute[189265]: 2025-09-30 07:43:49.212 2 DEBUG oslo_concurrency.lockutils [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:49 compute-0 nova_compute[189265]: 2025-09-30 07:43:49.213 2 DEBUG oslo_concurrency.lockutils [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:49 compute-0 nova_compute[189265]: 2025-09-30 07:43:49.312 2 DEBUG nova.compute.provider_tree [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:43:49 compute-0 nova_compute[189265]: 2025-09-30 07:43:49.335 2 DEBUG nova.compute.manager [req-e2a8b36c-b448-4d2d-85e8-1ad9bdd2ed2e req-90e0f5ee-59f6-4571-b276-9f415732626d 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6616fa8c-6043-4809-970f-befa571a47bf] Received event network-vif-deleted-103df688-88b9-4fd1-98fe-1f1b8db21a1d external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:43:49 compute-0 nova_compute[189265]: 2025-09-30 07:43:49.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:49 compute-0 nova_compute[189265]: 2025-09-30 07:43:49.821 2 DEBUG nova.scheduler.client.report [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:43:50 compute-0 nova_compute[189265]: 2025-09-30 07:43:50.329 2 DEBUG oslo_concurrency.lockutils [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.116s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:50 compute-0 nova_compute[189265]: 2025-09-30 07:43:50.363 2 INFO nova.scheduler.client.report [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Deleted allocations for instance 6616fa8c-6043-4809-970f-befa571a47bf
Sep 30 07:43:50 compute-0 nova_compute[189265]: 2025-09-30 07:43:50.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:51 compute-0 nova_compute[189265]: 2025-09-30 07:43:51.408 2 DEBUG oslo_concurrency.lockutils [None req-cec45fe2-778c-4b17-b6dc-70127aa32e9e 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "6616fa8c-6043-4809-970f-befa571a47bf" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.298s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:51 compute-0 nova_compute[189265]: 2025-09-30 07:43:51.874 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.165 2 DEBUG oslo_concurrency.lockutils [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Acquiring lock "6e93e66e-2937-4004-b34f-3fe033cb2935" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.166 2 DEBUG oslo_concurrency.lockutils [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "6e93e66e-2937-4004-b34f-3fe033cb2935" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.166 2 DEBUG oslo_concurrency.lockutils [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Acquiring lock "6e93e66e-2937-4004-b34f-3fe033cb2935-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.166 2 DEBUG oslo_concurrency.lockutils [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "6e93e66e-2937-4004-b34f-3fe033cb2935-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.168 2 DEBUG oslo_concurrency.lockutils [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "6e93e66e-2937-4004-b34f-3fe033cb2935-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.184 2 INFO nova.compute.manager [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Terminating instance
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.704 2 DEBUG nova.compute.manager [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:43:52 compute-0 kernel: tap4d31ed33-b1 (unregistering): left promiscuous mode
Sep 30 07:43:52 compute-0 NetworkManager[51813]: <info>  [1759218232.7329] device (tap4d31ed33-b1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:43:52 compute-0 ovn_controller[91436]: 2025-09-30T07:43:52Z|00267|binding|INFO|Releasing lport 4d31ed33-b19b-46b2-80ed-c3276286d105 from this chassis (sb_readonly=0)
Sep 30 07:43:52 compute-0 ovn_controller[91436]: 2025-09-30T07:43:52Z|00268|binding|INFO|Setting lport 4d31ed33-b19b-46b2-80ed-c3276286d105 down in Southbound
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:52 compute-0 ovn_controller[91436]: 2025-09-30T07:43:52Z|00269|binding|INFO|Removing iface tap4d31ed33-b1 ovn-installed in OVS
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:52 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:52.751 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:74:6c 10.100.0.3'], port_security=['fa:16:3e:95:74:6c 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '6e93e66e-2937-4004-b34f-3fe033cb2935', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb449ca8f36d45d88d1ef08bcb192ca6', 'neutron:revision_number': '14', 'neutron:security_group_ids': 'd8d8aea8-dfd7-4028-b831-7a1c1bc3f21e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a63f614-d134-4922-9cae-18c0918b6eb4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=4d31ed33-b19b-46b2-80ed-c3276286d105) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:43:52 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:52.753 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 4d31ed33-b19b-46b2-80ed-c3276286d105 in datapath c37ccab9-b2b3-4600-9cd6-fc38d618b79f unbound from our chassis
Sep 30 07:43:52 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:52.755 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c37ccab9-b2b3-4600-9cd6-fc38d618b79f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:43:52 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:52.756 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[ec757bc8-d9be-4c81-9cc3-b12364fe40d3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:52 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:52.758 100322 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f namespace which is not needed anymore
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:52 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Sep 30 07:43:52 compute-0 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000001a.scope: Consumed 2.821s CPU time.
Sep 30 07:43:52 compute-0 systemd-machined[149233]: Machine qemu-22-instance-0000001a terminated.
Sep 30 07:43:52 compute-0 podman[224178]: 2025-09-30 07:43:52.88618187 +0000 UTC m=+0.111175342 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:43:52 compute-0 neutron-haproxy-ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f[223671]: [NOTICE]   (223675) : haproxy version is 3.0.5-8e879a5
Sep 30 07:43:52 compute-0 neutron-haproxy-ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f[223671]: [NOTICE]   (223675) : path to executable is /usr/sbin/haproxy
Sep 30 07:43:52 compute-0 neutron-haproxy-ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f[223671]: [WARNING]  (223675) : Exiting Master process...
Sep 30 07:43:52 compute-0 podman[224220]: 2025-09-30 07:43:52.933122887 +0000 UTC m=+0.047798103 container kill 971cf3f62cd4f68378814c4133ce4cc2f47af5011357a85aa85b272f4f75b337 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:43:52 compute-0 neutron-haproxy-ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f[223671]: [ALERT]    (223675) : Current worker (223677) exited with code 143 (Terminated)
Sep 30 07:43:52 compute-0 neutron-haproxy-ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f[223671]: [WARNING]  (223675) : All workers exited. Exiting... (0)
Sep 30 07:43:52 compute-0 systemd[1]: libpod-971cf3f62cd4f68378814c4133ce4cc2f47af5011357a85aa85b272f4f75b337.scope: Deactivated successfully.
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.952 2 DEBUG nova.compute.manager [req-178c509a-10e0-4dc6-9cc9-ddbe817003e4 req-873dd964-490f-4062-a378-2f931051d964 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Received event network-vif-unplugged-4d31ed33-b19b-46b2-80ed-c3276286d105 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.952 2 DEBUG oslo_concurrency.lockutils [req-178c509a-10e0-4dc6-9cc9-ddbe817003e4 req-873dd964-490f-4062-a378-2f931051d964 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "6e93e66e-2937-4004-b34f-3fe033cb2935-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.952 2 DEBUG oslo_concurrency.lockutils [req-178c509a-10e0-4dc6-9cc9-ddbe817003e4 req-873dd964-490f-4062-a378-2f931051d964 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "6e93e66e-2937-4004-b34f-3fe033cb2935-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.953 2 DEBUG oslo_concurrency.lockutils [req-178c509a-10e0-4dc6-9cc9-ddbe817003e4 req-873dd964-490f-4062-a378-2f931051d964 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "6e93e66e-2937-4004-b34f-3fe033cb2935-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.953 2 DEBUG nova.compute.manager [req-178c509a-10e0-4dc6-9cc9-ddbe817003e4 req-873dd964-490f-4062-a378-2f931051d964 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] No waiting events found dispatching network-vif-unplugged-4d31ed33-b19b-46b2-80ed-c3276286d105 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.953 2 DEBUG nova.compute.manager [req-178c509a-10e0-4dc6-9cc9-ddbe817003e4 req-873dd964-490f-4062-a378-2f931051d964 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Received event network-vif-unplugged-4d31ed33-b19b-46b2-80ed-c3276286d105 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.969 2 INFO nova.virt.libvirt.driver [-] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Instance destroyed successfully.
Sep 30 07:43:52 compute-0 nova_compute[189265]: 2025-09-30 07:43:52.969 2 DEBUG nova.objects.instance [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lazy-loading 'resources' on Instance uuid 6e93e66e-2937-4004-b34f-3fe033cb2935 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:43:52 compute-0 podman[224246]: 2025-09-30 07:43:52.9812898 +0000 UTC m=+0.029850968 container died 971cf3f62cd4f68378814c4133ce4cc2f47af5011357a85aa85b272f4f75b337 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Sep 30 07:43:53 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-971cf3f62cd4f68378814c4133ce4cc2f47af5011357a85aa85b272f4f75b337-userdata-shm.mount: Deactivated successfully.
Sep 30 07:43:53 compute-0 systemd[1]: var-lib-containers-storage-overlay-4b8c8311359e8cbbe1370ab8c5db668a2a209a147dc25e549d690826104fa59b-merged.mount: Deactivated successfully.
Sep 30 07:43:53 compute-0 podman[224246]: 2025-09-30 07:43:53.025968962 +0000 UTC m=+0.074530030 container cleanup 971cf3f62cd4f68378814c4133ce4cc2f47af5011357a85aa85b272f4f75b337 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 07:43:53 compute-0 systemd[1]: libpod-conmon-971cf3f62cd4f68378814c4133ce4cc2f47af5011357a85aa85b272f4f75b337.scope: Deactivated successfully.
Sep 30 07:43:53 compute-0 podman[224258]: 2025-09-30 07:43:53.042779525 +0000 UTC m=+0.067305623 container remove 971cf3f62cd4f68378814c4133ce4cc2f47af5011357a85aa85b272f4f75b337 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:43:53 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:53.049 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[8da95df3-d208-4ea4-983d-de96ee7a8a17]: (4, ("Tue Sep 30 07:43:52 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f (971cf3f62cd4f68378814c4133ce4cc2f47af5011357a85aa85b272f4f75b337)\n971cf3f62cd4f68378814c4133ce4cc2f47af5011357a85aa85b272f4f75b337\nTue Sep 30 07:43:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f (971cf3f62cd4f68378814c4133ce4cc2f47af5011357a85aa85b272f4f75b337)\n971cf3f62cd4f68378814c4133ce4cc2f47af5011357a85aa85b272f4f75b337\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:53 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:53.050 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[bb5986f3-8a48-4ef3-a95f-9c6a0a4c5e55]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:53 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:53.051 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c37ccab9-b2b3-4600-9cd6-fc38d618b79f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c37ccab9-b2b3-4600-9cd6-fc38d618b79f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:43:53 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:53.051 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[deefd4d2-56fc-43df-8297-14b4b04bcb61]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:53 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:53.052 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc37ccab9-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:53 compute-0 kernel: tapc37ccab9-b0: left promiscuous mode
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:53 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:53.131 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[9ff8ac17-19f4-43c4-83ac-9c4d5c84a4a6]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:53 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:53.161 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[1dfc07cf-8479-4fc0-99a0-8285bcf3a4c9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:53 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:53.162 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[1a181208-5922-4053-9e0b-314257dc3447]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:53 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:53.185 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[4d4e82e3-36c6-4790-abfc-dbc5f0179965]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 600916, 'reachable_time': 33605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224290, 'error': None, 'target': 'ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:53 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:53.187 100440 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c37ccab9-b2b3-4600-9cd6-fc38d618b79f deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 07:43:53 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:43:53.188 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[13210d4f-7710-4da3-ad8b-c5a9860908af]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:43:53 compute-0 systemd[1]: run-netns-ovnmeta\x2dc37ccab9\x2db2b3\x2d4600\x2d9cd6\x2dfc38d618b79f.mount: Deactivated successfully.
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.478 2 DEBUG nova.virt.libvirt.vif [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T07:41:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteVmWorkloadBalanceStrategy-server-2113466578',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutevmworkloadbalancestrategy-server-2113466578',id=26,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:42:10Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='eb449ca8f36d45d88d1ef08bcb192ca6',ramdisk_id='',reservation_id='r-5v2dyfq0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',clean_attempts='1',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif
_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteVmWorkloadBalanceStrategy-264644006',owner_user_name='tempest-TestExecuteVmWorkloadBalanceStrategy-264644006-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:43:39Z,user_data=None,user_id='5c02a0a41ab14f6a92e1e6e2798736ae',uuid=6e93e66e-2937-4004-b34f-3fe033cb2935,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4d31ed33-b19b-46b2-80ed-c3276286d105", "address": "fa:16:3e:95:74:6c", "network": {"id": "c37ccab9-b2b3-4600-9cd6-fc38d618b79f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1008995657-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75afc4c4c3cd416898ef46cd7b7e99de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d31ed33-b1", "ovs_interfaceid": "4d31ed33-b19b-46b2-80ed-c3276286d105", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.478 2 DEBUG nova.network.os_vif_util [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Converting VIF {"id": "4d31ed33-b19b-46b2-80ed-c3276286d105", "address": "fa:16:3e:95:74:6c", "network": {"id": "c37ccab9-b2b3-4600-9cd6-fc38d618b79f", "bridge": "br-int", "label": "tempest-TestExecuteVmWorkloadBalanceStrategy-1008995657-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "75afc4c4c3cd416898ef46cd7b7e99de", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4d31ed33-b1", "ovs_interfaceid": "4d31ed33-b19b-46b2-80ed-c3276286d105", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.479 2 DEBUG nova.network.os_vif_util [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:95:74:6c,bridge_name='br-int',has_traffic_filtering=True,id=4d31ed33-b19b-46b2-80ed-c3276286d105,network=Network(c37ccab9-b2b3-4600-9cd6-fc38d618b79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d31ed33-b1') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.480 2 DEBUG os_vif [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:74:6c,bridge_name='br-int',has_traffic_filtering=True,id=4d31ed33-b19b-46b2-80ed-c3276286d105,network=Network(c37ccab9-b2b3-4600-9cd6-fc38d618b79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d31ed33-b1') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.482 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4d31ed33-b1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.487 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=ef223baf-7233-4d96-ac01-72eab59443ca) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.492 2 INFO os_vif [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:95:74:6c,bridge_name='br-int',has_traffic_filtering=True,id=4d31ed33-b19b-46b2-80ed-c3276286d105,network=Network(c37ccab9-b2b3-4600-9cd6-fc38d618b79f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4d31ed33-b1')
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.492 2 INFO nova.virt.libvirt.driver [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Deleting instance files /var/lib/nova/instances/6e93e66e-2937-4004-b34f-3fe033cb2935_del
Sep 30 07:43:53 compute-0 nova_compute[189265]: 2025-09-30 07:43:53.493 2 INFO nova.virt.libvirt.driver [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Deletion of /var/lib/nova/instances/6e93e66e-2937-4004-b34f-3fe033cb2935_del complete
Sep 30 07:43:54 compute-0 nova_compute[189265]: 2025-09-30 07:43:54.014 2 INFO nova.compute.manager [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Took 1.31 seconds to destroy the instance on the hypervisor.
Sep 30 07:43:54 compute-0 nova_compute[189265]: 2025-09-30 07:43:54.015 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:43:54 compute-0 nova_compute[189265]: 2025-09-30 07:43:54.015 2 DEBUG nova.compute.manager [-] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:43:54 compute-0 nova_compute[189265]: 2025-09-30 07:43:54.016 2 DEBUG nova.network.neutron [-] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:43:54 compute-0 nova_compute[189265]: 2025-09-30 07:43:54.016 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:54 compute-0 nova_compute[189265]: 2025-09-30 07:43:54.290 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:43:54 compute-0 nova_compute[189265]: 2025-09-30 07:43:54.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:54 compute-0 nova_compute[189265]: 2025-09-30 07:43:54.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:43:55 compute-0 nova_compute[189265]: 2025-09-30 07:43:55.016 2 DEBUG nova.compute.manager [req-a9a32429-cb5d-4a02-806f-b0992d00e0ae req-e1abdd9f-1a7b-419a-a5ae-d29b341d8b45 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Received event network-vif-unplugged-4d31ed33-b19b-46b2-80ed-c3276286d105 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:43:55 compute-0 nova_compute[189265]: 2025-09-30 07:43:55.017 2 DEBUG oslo_concurrency.lockutils [req-a9a32429-cb5d-4a02-806f-b0992d00e0ae req-e1abdd9f-1a7b-419a-a5ae-d29b341d8b45 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "6e93e66e-2937-4004-b34f-3fe033cb2935-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:55 compute-0 nova_compute[189265]: 2025-09-30 07:43:55.017 2 DEBUG oslo_concurrency.lockutils [req-a9a32429-cb5d-4a02-806f-b0992d00e0ae req-e1abdd9f-1a7b-419a-a5ae-d29b341d8b45 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "6e93e66e-2937-4004-b34f-3fe033cb2935-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:55 compute-0 nova_compute[189265]: 2025-09-30 07:43:55.018 2 DEBUG oslo_concurrency.lockutils [req-a9a32429-cb5d-4a02-806f-b0992d00e0ae req-e1abdd9f-1a7b-419a-a5ae-d29b341d8b45 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "6e93e66e-2937-4004-b34f-3fe033cb2935-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:55 compute-0 nova_compute[189265]: 2025-09-30 07:43:55.018 2 DEBUG nova.compute.manager [req-a9a32429-cb5d-4a02-806f-b0992d00e0ae req-e1abdd9f-1a7b-419a-a5ae-d29b341d8b45 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] No waiting events found dispatching network-vif-unplugged-4d31ed33-b19b-46b2-80ed-c3276286d105 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:43:55 compute-0 nova_compute[189265]: 2025-09-30 07:43:55.019 2 DEBUG nova.compute.manager [req-a9a32429-cb5d-4a02-806f-b0992d00e0ae req-e1abdd9f-1a7b-419a-a5ae-d29b341d8b45 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Received event network-vif-unplugged-4d31ed33-b19b-46b2-80ed-c3276286d105 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:43:55 compute-0 nova_compute[189265]: 2025-09-30 07:43:55.019 2 DEBUG nova.compute.manager [req-a9a32429-cb5d-4a02-806f-b0992d00e0ae req-e1abdd9f-1a7b-419a-a5ae-d29b341d8b45 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Received event network-vif-deleted-4d31ed33-b19b-46b2-80ed-c3276286d105 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:43:55 compute-0 nova_compute[189265]: 2025-09-30 07:43:55.020 2 INFO nova.compute.manager [req-a9a32429-cb5d-4a02-806f-b0992d00e0ae req-e1abdd9f-1a7b-419a-a5ae-d29b341d8b45 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Neutron deleted interface 4d31ed33-b19b-46b2-80ed-c3276286d105; detaching it from the instance and deleting it from the info cache
Sep 30 07:43:55 compute-0 nova_compute[189265]: 2025-09-30 07:43:55.021 2 DEBUG nova.network.neutron [req-a9a32429-cb5d-4a02-806f-b0992d00e0ae req-e1abdd9f-1a7b-419a-a5ae-d29b341d8b45 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:43:55 compute-0 nova_compute[189265]: 2025-09-30 07:43:55.091 2 DEBUG nova.network.neutron [-] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:43:55 compute-0 nova_compute[189265]: 2025-09-30 07:43:55.530 2 DEBUG nova.compute.manager [req-a9a32429-cb5d-4a02-806f-b0992d00e0ae req-e1abdd9f-1a7b-419a-a5ae-d29b341d8b45 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Detach interface failed, port_id=4d31ed33-b19b-46b2-80ed-c3276286d105, reason: Instance 6e93e66e-2937-4004-b34f-3fe033cb2935 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 07:43:55 compute-0 nova_compute[189265]: 2025-09-30 07:43:55.598 2 INFO nova.compute.manager [-] [instance: 6e93e66e-2937-4004-b34f-3fe033cb2935] Took 1.58 seconds to deallocate network for instance.
Sep 30 07:43:55 compute-0 nova_compute[189265]: 2025-09-30 07:43:55.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:43:56 compute-0 nova_compute[189265]: 2025-09-30 07:43:56.121 2 DEBUG oslo_concurrency.lockutils [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:43:56 compute-0 nova_compute[189265]: 2025-09-30 07:43:56.121 2 DEBUG oslo_concurrency.lockutils [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:43:56 compute-0 nova_compute[189265]: 2025-09-30 07:43:56.127 2 DEBUG oslo_concurrency.lockutils [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.005s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:56 compute-0 nova_compute[189265]: 2025-09-30 07:43:56.155 2 INFO nova.scheduler.client.report [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Deleted allocations for instance 6e93e66e-2937-4004-b34f-3fe033cb2935
Sep 30 07:43:56 compute-0 nova_compute[189265]: 2025-09-30 07:43:56.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:43:56 compute-0 nova_compute[189265]: 2025-09-30 07:43:56.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:43:57 compute-0 nova_compute[189265]: 2025-09-30 07:43:57.186 2 DEBUG oslo_concurrency.lockutils [None req-6ca7d715-f57c-4226-bd6a-1fb0c6e37f96 5c02a0a41ab14f6a92e1e6e2798736ae eb449ca8f36d45d88d1ef08bcb192ca6 - - default default] Lock "6e93e66e-2937-4004-b34f-3fe033cb2935" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.021s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:43:58 compute-0 nova_compute[189265]: 2025-09-30 07:43:58.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:59 compute-0 nova_compute[189265]: 2025-09-30 07:43:59.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:43:59 compute-0 podman[199733]: time="2025-09-30T07:43:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:43:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:43:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:43:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:43:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3011 "" "Go-http-client/1.1"
Sep 30 07:44:01 compute-0 openstack_network_exporter[201859]: ERROR   07:44:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:44:01 compute-0 openstack_network_exporter[201859]: ERROR   07:44:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:44:01 compute-0 openstack_network_exporter[201859]: ERROR   07:44:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:44:01 compute-0 openstack_network_exporter[201859]: ERROR   07:44:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:44:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:44:01 compute-0 openstack_network_exporter[201859]: ERROR   07:44:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:44:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:44:02 compute-0 podman[224292]: 2025-09-30 07:44:02.486188366 +0000 UTC m=+0.067858749 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Sep 30 07:44:02 compute-0 nova_compute[189265]: 2025-09-30 07:44:02.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:44:03 compute-0 nova_compute[189265]: 2025-09-30 07:44:03.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:03 compute-0 nova_compute[189265]: 2025-09-30 07:44:03.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:04 compute-0 nova_compute[189265]: 2025-09-30 07:44:04.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:05 compute-0 nova_compute[189265]: 2025-09-30 07:44:05.295 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:44:05 compute-0 nova_compute[189265]: 2025-09-30 07:44:05.296 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:44:05 compute-0 nova_compute[189265]: 2025-09-30 07:44:05.813 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:44:05 compute-0 nova_compute[189265]: 2025-09-30 07:44:05.814 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:44:05 compute-0 nova_compute[189265]: 2025-09-30 07:44:05.814 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:44:05 compute-0 nova_compute[189265]: 2025-09-30 07:44:05.815 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:44:05 compute-0 podman[224314]: 2025-09-30 07:44:05.919284195 +0000 UTC m=+0.063633127 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal)
Sep 30 07:44:05 compute-0 nova_compute[189265]: 2025-09-30 07:44:05.945 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:44:05 compute-0 nova_compute[189265]: 2025-09-30 07:44:05.946 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:44:05 compute-0 nova_compute[189265]: 2025-09-30 07:44:05.961 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:44:05 compute-0 nova_compute[189265]: 2025-09-30 07:44:05.962 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5844MB free_disk=73.3036880493164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:44:05 compute-0 nova_compute[189265]: 2025-09-30 07:44:05.962 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:44:05 compute-0 nova_compute[189265]: 2025-09-30 07:44:05.962 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:44:07 compute-0 nova_compute[189265]: 2025-09-30 07:44:07.025 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:44:07 compute-0 nova_compute[189265]: 2025-09-30 07:44:07.025 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:44:05 up  1:41,  0 user,  load average: 0.54, 0.39, 0.32\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:44:07 compute-0 nova_compute[189265]: 2025-09-30 07:44:07.049 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:44:07 compute-0 nova_compute[189265]: 2025-09-30 07:44:07.555 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:44:08 compute-0 nova_compute[189265]: 2025-09-30 07:44:08.065 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:44:08 compute-0 nova_compute[189265]: 2025-09-30 07:44:08.066 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.103s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:44:08 compute-0 nova_compute[189265]: 2025-09-30 07:44:08.066 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:44:08 compute-0 nova_compute[189265]: 2025-09-30 07:44:08.066 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 07:44:08 compute-0 nova_compute[189265]: 2025-09-30 07:44:08.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:09 compute-0 nova_compute[189265]: 2025-09-30 07:44:09.065 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:44:09 compute-0 podman[224340]: 2025-09-30 07:44:09.475991092 +0000 UTC m=+0.049393818 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 07:44:09 compute-0 nova_compute[189265]: 2025-09-30 07:44:09.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:09 compute-0 podman[224339]: 2025-09-30 07:44:09.503321637 +0000 UTC m=+0.071969047 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 07:44:09 compute-0 podman[224341]: 2025-09-30 07:44:09.525710729 +0000 UTC m=+0.094154973 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Sep 30 07:44:11 compute-0 nova_compute[189265]: 2025-09-30 07:44:11.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:44:13 compute-0 nova_compute[189265]: 2025-09-30 07:44:13.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:13 compute-0 nova_compute[189265]: 2025-09-30 07:44:13.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:44:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:44:14.295 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:04:a5:0f 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5627ebe-9328-432f-88fb-b5b539662efd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6633f775c5d46dc9c6c213b63954b2f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4956bab-94e0-436a-b508-eeb3061671e6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=694a3f4b-a908-40fb-abbe-4687074d8093) old=Port_Binding(mac=['fa:16:3e:04:a5:0f'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5627ebe-9328-432f-88fb-b5b539662efd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'e6633f775c5d46dc9c6c213b63954b2f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:44:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:44:14.296 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 694a3f4b-a908-40fb-abbe-4687074d8093 in datapath b5627ebe-9328-432f-88fb-b5b539662efd updated
Sep 30 07:44:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:44:14.298 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b5627ebe-9328-432f-88fb-b5b539662efd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:44:14 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:44:14.299 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[741eb402-0616-4799-8f14-d87e1f157449]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:44:14 compute-0 nova_compute[189265]: 2025-09-30 07:44:14.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:18 compute-0 nova_compute[189265]: 2025-09-30 07:44:18.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:19 compute-0 nova_compute[189265]: 2025-09-30 07:44:19.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:44:20.587 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:44:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:44:20.588 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:44:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:44:20.588 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:44:23 compute-0 podman[224402]: 2025-09-30 07:44:23.47487957 +0000 UTC m=+0.056026120 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:44:23 compute-0 nova_compute[189265]: 2025-09-30 07:44:23.500 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:24 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:44:24.206 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:56:c4 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-bc16084e-e9e6-4d7c-8613-7b9d67ad61cd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc16084e-e9e6-4d7c-8613-7b9d67ad61cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd45f15fbdba414c8d395e5ff149cbc4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3504d9ac-3611-466a-acbd-fc84b085f8fa, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a064443b-2f89-45f6-8045-cba0597de88d) old=Port_Binding(mac=['fa:16:3e:c6:56:c4'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-bc16084e-e9e6-4d7c-8613-7b9d67ad61cd', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bc16084e-e9e6-4d7c-8613-7b9d67ad61cd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd45f15fbdba414c8d395e5ff149cbc4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:44:24 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:44:24.207 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a064443b-2f89-45f6-8045-cba0597de88d in datapath bc16084e-e9e6-4d7c-8613-7b9d67ad61cd updated
Sep 30 07:44:24 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:44:24.209 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bc16084e-e9e6-4d7c-8613-7b9d67ad61cd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:44:24 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:44:24.210 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[4880e3c6-8b73-4201-abac-347dcb65d9c9]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:44:24 compute-0 nova_compute[189265]: 2025-09-30 07:44:24.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:28 compute-0 nova_compute[189265]: 2025-09-30 07:44:28.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:29 compute-0 nova_compute[189265]: 2025-09-30 07:44:29.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:29 compute-0 podman[199733]: time="2025-09-30T07:44:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:44:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:44:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:44:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:44:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Sep 30 07:44:30 compute-0 nova_compute[189265]: 2025-09-30 07:44:30.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:44:30 compute-0 nova_compute[189265]: 2025-09-30 07:44:30.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 07:44:31 compute-0 nova_compute[189265]: 2025-09-30 07:44:31.296 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 07:44:31 compute-0 openstack_network_exporter[201859]: ERROR   07:44:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:44:31 compute-0 openstack_network_exporter[201859]: ERROR   07:44:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:44:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:44:31 compute-0 openstack_network_exporter[201859]: ERROR   07:44:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:44:31 compute-0 openstack_network_exporter[201859]: ERROR   07:44:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:44:31 compute-0 openstack_network_exporter[201859]: ERROR   07:44:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:44:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:44:33 compute-0 nova_compute[189265]: 2025-09-30 07:44:33.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:33 compute-0 podman[224428]: 2025-09-30 07:44:33.505559577 +0000 UTC m=+0.085392202 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_id=iscsid)
Sep 30 07:44:34 compute-0 nova_compute[189265]: 2025-09-30 07:44:34.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:36 compute-0 podman[224449]: 2025-09-30 07:44:36.497706649 +0000 UTC m=+0.080763349 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_id=edpm, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., version=9.6, distribution-scope=public, container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible)
Sep 30 07:44:37 compute-0 ovn_controller[91436]: 2025-09-30T07:44:37Z|00270|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Sep 30 07:44:38 compute-0 nova_compute[189265]: 2025-09-30 07:44:38.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:39 compute-0 nova_compute[189265]: 2025-09-30 07:44:39.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:40 compute-0 podman[224471]: 2025-09-30 07:44:40.500677136 +0000 UTC m=+0.072040489 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:44:40 compute-0 podman[224470]: 2025-09-30 07:44:40.515764609 +0000 UTC m=+0.090060616 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 07:44:40 compute-0 podman[224472]: 2025-09-30 07:44:40.526359873 +0000 UTC m=+0.099741654 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 07:44:42 compute-0 nova_compute[189265]: 2025-09-30 07:44:42.641 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:44:43 compute-0 nova_compute[189265]: 2025-09-30 07:44:43.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:44 compute-0 unix_chkpwd[224538]: password check failed for user (root)
Sep 30 07:44:44 compute-0 sshd-session[224536]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126  user=root
Sep 30 07:44:44 compute-0 nova_compute[189265]: 2025-09-30 07:44:44.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:44 compute-0 sshd-session[224539]: Invalid user pi from 52.224.109.126 port 48654
Sep 30 07:44:44 compute-0 sshd-session[224539]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:44:44 compute-0 sshd-session[224539]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:44:45 compute-0 sshd-session[224541]: Invalid user hive from 52.224.109.126 port 48660
Sep 30 07:44:45 compute-0 sshd-session[224541]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:44:45 compute-0 sshd-session[224541]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:44:46 compute-0 sshd-session[224536]: Failed password for root from 52.224.109.126 port 48652 ssh2
Sep 30 07:44:46 compute-0 sshd-session[224536]: Connection closed by authenticating user root 52.224.109.126 port 48652 [preauth]
Sep 30 07:44:46 compute-0 sshd-session[224543]: Invalid user git from 52.224.109.126 port 48672
Sep 30 07:44:46 compute-0 sshd-session[224543]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:44:46 compute-0 sshd-session[224543]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:44:46 compute-0 sshd-session[224539]: Failed password for invalid user pi from 52.224.109.126 port 48654 ssh2
Sep 30 07:44:47 compute-0 sshd-session[224539]: Connection closed by invalid user pi 52.224.109.126 port 48654 [preauth]
Sep 30 07:44:47 compute-0 sshd-session[224541]: Failed password for invalid user hive from 52.224.109.126 port 48660 ssh2
Sep 30 07:44:47 compute-0 sshd-session[224545]: Invalid user wang from 52.224.109.126 port 48674
Sep 30 07:44:47 compute-0 sshd-session[224541]: Connection closed by invalid user hive 52.224.109.126 port 48660 [preauth]
Sep 30 07:44:47 compute-0 sshd-session[224545]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:44:47 compute-0 sshd-session[224545]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:44:48 compute-0 nova_compute[189265]: 2025-09-30 07:44:48.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:48 compute-0 sshd-session[224547]: Invalid user nginx from 52.224.109.126 port 48688
Sep 30 07:44:48 compute-0 sshd-session[224547]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:44:48 compute-0 sshd-session[224547]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:44:49 compute-0 sshd-session[224543]: Failed password for invalid user git from 52.224.109.126 port 48672 ssh2
Sep 30 07:44:49 compute-0 sshd-session[224549]: Invalid user mongo from 52.224.109.126 port 48702
Sep 30 07:44:49 compute-0 sshd-session[224549]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:44:49 compute-0 sshd-session[224549]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:44:49 compute-0 nova_compute[189265]: 2025-09-30 07:44:49.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:50 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:44:50.181 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:44:50 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:44:50.182 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:44:50 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:44:50.185 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:44:50 compute-0 nova_compute[189265]: 2025-09-30 07:44:50.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:50 compute-0 sshd-session[224545]: Failed password for invalid user wang from 52.224.109.126 port 48674 ssh2
Sep 30 07:44:50 compute-0 sshd-session[224552]: Invalid user user from 52.224.109.126 port 48714
Sep 30 07:44:50 compute-0 sshd-session[224552]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:44:50 compute-0 sshd-session[224552]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:44:50 compute-0 sshd-session[224543]: Connection closed by invalid user git 52.224.109.126 port 48672 [preauth]
Sep 30 07:44:50 compute-0 sshd-session[224547]: Failed password for invalid user nginx from 52.224.109.126 port 48688 ssh2
Sep 30 07:44:50 compute-0 sshd-session[224545]: Connection closed by invalid user wang 52.224.109.126 port 48674 [preauth]
Sep 30 07:44:51 compute-0 sshd[124648]: drop connection #3 from [52.224.109.126]:48718 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:44:51 compute-0 sshd-session[224547]: Connection closed by invalid user nginx 52.224.109.126 port 48688 [preauth]
Sep 30 07:44:51 compute-0 sshd-session[224549]: Failed password for invalid user mongo from 52.224.109.126 port 48702 ssh2
Sep 30 07:44:51 compute-0 sshd-session[224552]: Failed password for invalid user user from 52.224.109.126 port 48714 ssh2
Sep 30 07:44:52 compute-0 sshd[124648]: drop connection #2 from [52.224.109.126]:48722 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:44:52 compute-0 nova_compute[189265]: 2025-09-30 07:44:52.292 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:44:52 compute-0 sshd-session[224549]: Connection closed by invalid user mongo 52.224.109.126 port 48702 [preauth]
Sep 30 07:44:52 compute-0 sshd-session[224552]: Connection closed by invalid user user 52.224.109.126 port 48714 [preauth]
Sep 30 07:44:52 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:48730 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:44:53 compute-0 nova_compute[189265]: 2025-09-30 07:44:53.511 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:53 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:46318 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:44:54 compute-0 podman[224554]: 2025-09-30 07:44:54.490032498 +0000 UTC m=+0.072217774 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:44:54 compute-0 nova_compute[189265]: 2025-09-30 07:44:54.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:54 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:46322 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:44:55 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:46326 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:44:55 compute-0 nova_compute[189265]: 2025-09-30 07:44:55.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:44:56 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:46342 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:44:56 compute-0 nova_compute[189265]: 2025-09-30 07:44:56.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:44:57 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:46346 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:44:58 compute-0 nova_compute[189265]: 2025-09-30 07:44:58.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:58 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:46358 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:44:58 compute-0 nova_compute[189265]: 2025-09-30 07:44:58.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:44:58 compute-0 nova_compute[189265]: 2025-09-30 07:44:58.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:44:59 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:46366 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:44:59 compute-0 nova_compute[189265]: 2025-09-30 07:44:59.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:44:59 compute-0 podman[199733]: time="2025-09-30T07:44:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:44:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:44:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:44:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:44:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3008 "" "Go-http-client/1.1"
Sep 30 07:45:00 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:46368 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:01 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:46384 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:01 compute-0 openstack_network_exporter[201859]: ERROR   07:45:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:45:01 compute-0 openstack_network_exporter[201859]: ERROR   07:45:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:45:01 compute-0 openstack_network_exporter[201859]: ERROR   07:45:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:45:01 compute-0 openstack_network_exporter[201859]: ERROR   07:45:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:45:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:45:01 compute-0 openstack_network_exporter[201859]: ERROR   07:45:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:45:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:45:02 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:46388 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:03 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:52964 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:03 compute-0 nova_compute[189265]: 2025-09-30 07:45:03.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:04 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:52976 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:04 compute-0 podman[224581]: 2025-09-30 07:45:04.453790874 +0000 UTC m=+0.042998805 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20250930, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 07:45:04 compute-0 nova_compute[189265]: 2025-09-30 07:45:04.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:04 compute-0 nova_compute[189265]: 2025-09-30 07:45:04.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:45:04 compute-0 nova_compute[189265]: 2025-09-30 07:45:04.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:45:04 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:52992 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:05 compute-0 nova_compute[189265]: 2025-09-30 07:45:05.308 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:45:05 compute-0 nova_compute[189265]: 2025-09-30 07:45:05.308 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:45:05 compute-0 nova_compute[189265]: 2025-09-30 07:45:05.309 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:45:05 compute-0 nova_compute[189265]: 2025-09-30 07:45:05.309 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:45:05 compute-0 nova_compute[189265]: 2025-09-30 07:45:05.461 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:45:05 compute-0 nova_compute[189265]: 2025-09-30 07:45:05.462 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:45:05 compute-0 nova_compute[189265]: 2025-09-30 07:45:05.477 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.015s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:45:05 compute-0 nova_compute[189265]: 2025-09-30 07:45:05.478 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5850MB free_disk=73.3036003112793GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:45:05 compute-0 nova_compute[189265]: 2025-09-30 07:45:05.478 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:45:05 compute-0 nova_compute[189265]: 2025-09-30 07:45:05.478 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:45:05 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53004 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:06 compute-0 nova_compute[189265]: 2025-09-30 07:45:06.602 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:45:06 compute-0 nova_compute[189265]: 2025-09-30 07:45:06.602 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:45:05 up  1:42,  0 user,  load average: 0.34, 0.37, 0.31\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:45:06 compute-0 nova_compute[189265]: 2025-09-30 07:45:06.677 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing inventories for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 07:45:06 compute-0 nova_compute[189265]: 2025-09-30 07:45:06.725 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating ProviderTree inventory for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 07:45:06 compute-0 nova_compute[189265]: 2025-09-30 07:45:06.726 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating inventory in ProviderTree for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 07:45:06 compute-0 nova_compute[189265]: 2025-09-30 07:45:06.741 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing aggregate associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 07:45:06 compute-0 nova_compute[189265]: 2025-09-30 07:45:06.759 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing trait associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, traits: COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_FMA3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_AC97,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_ABM,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,HW_CPU_X86_CLMUL _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 07:45:06 compute-0 nova_compute[189265]: 2025-09-30 07:45:06.780 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:45:06 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53012 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:07 compute-0 nova_compute[189265]: 2025-09-30 07:45:07.290 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:45:07 compute-0 podman[224602]: 2025-09-30 07:45:07.460319469 +0000 UTC m=+0.051629743 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.buildah.version=1.33.7, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, name=ubi9-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=)
Sep 30 07:45:07 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53026 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:07 compute-0 nova_compute[189265]: 2025-09-30 07:45:07.799 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:45:07 compute-0 nova_compute[189265]: 2025-09-30 07:45:07.800 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.321s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:45:08 compute-0 nova_compute[189265]: 2025-09-30 07:45:08.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:08 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53038 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:08 compute-0 nova_compute[189265]: 2025-09-30 07:45:08.800 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:45:09 compute-0 nova_compute[189265]: 2025-09-30 07:45:09.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:09 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53048 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:10 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53052 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:11 compute-0 podman[224623]: 2025-09-30 07:45:11.498256019 +0000 UTC m=+0.074948162 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 07:45:11 compute-0 podman[224624]: 2025-09-30 07:45:11.506093454 +0000 UTC m=+0.092837636 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 07:45:11 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53056 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:11 compute-0 podman[224625]: 2025-09-30 07:45:11.514963329 +0000 UTC m=+0.088095500 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 07:45:11 compute-0 nova_compute[189265]: 2025-09-30 07:45:11.790 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:45:12 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53066 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:13 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56598 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:13 compute-0 nova_compute[189265]: 2025-09-30 07:45:13.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:14 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56608 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:14 compute-0 nova_compute[189265]: 2025-09-30 07:45:14.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:15 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56618 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:16 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56632 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:17 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56642 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:17 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56650 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:18 compute-0 nova_compute[189265]: 2025-09-30 07:45:18.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:18 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56666 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:19 compute-0 nova_compute[189265]: 2025-09-30 07:45:19.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:19 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56676 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:20.592 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:45:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:20.592 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:45:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:20.592 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:45:20 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56678 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:21 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56684 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:22 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56690 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:23 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:52988 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:23 compute-0 nova_compute[189265]: 2025-09-30 07:45:23.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:24 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53004 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:24 compute-0 nova_compute[189265]: 2025-09-30 07:45:24.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:25 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53012 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:25 compute-0 podman[224689]: 2025-09-30 07:45:25.511279154 +0000 UTC m=+0.089436653 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:45:26 compute-0 sshd-session[224713]: Invalid user plexserver from 52.224.109.126 port 53028
Sep 30 07:45:26 compute-0 sshd-session[224713]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:45:26 compute-0 sshd-session[224713]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:45:27 compute-0 sshd-session[224715]: Invalid user sonar from 52.224.109.126 port 53042
Sep 30 07:45:27 compute-0 sshd-session[224715]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:45:27 compute-0 sshd-session[224715]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:45:27 compute-0 nova_compute[189265]: 2025-09-30 07:45:27.356 2 DEBUG nova.virt.libvirt.driver [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Creating tmpfile /var/lib/nova/instances/tmphlbzyxc8 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 07:45:27 compute-0 nova_compute[189265]: 2025-09-30 07:45:27.357 2 WARNING neutronclient.v2_0.client [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:45:27 compute-0 nova_compute[189265]: 2025-09-30 07:45:27.359 2 DEBUG nova.compute.manager [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphlbzyxc8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 07:45:28 compute-0 sshd-session[224717]: Invalid user app from 52.224.109.126 port 53052
Sep 30 07:45:28 compute-0 sshd-session[224717]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:45:28 compute-0 sshd-session[224717]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:45:28 compute-0 nova_compute[189265]: 2025-09-30 07:45:28.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:28 compute-0 sshd-session[224713]: Failed password for invalid user plexserver from 52.224.109.126 port 53028 ssh2
Sep 30 07:45:29 compute-0 sshd-session[224719]: Invalid user tools from 52.224.109.126 port 53054
Sep 30 07:45:29 compute-0 sshd-session[224719]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:45:29 compute-0 sshd-session[224719]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:45:29 compute-0 sshd-session[224715]: Failed password for invalid user sonar from 52.224.109.126 port 53042 ssh2
Sep 30 07:45:29 compute-0 nova_compute[189265]: 2025-09-30 07:45:29.397 2 WARNING neutronclient.v2_0.client [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:45:29 compute-0 sshd-session[224715]: Connection closed by invalid user sonar 52.224.109.126 port 53042 [preauth]
Sep 30 07:45:29 compute-0 nova_compute[189265]: 2025-09-30 07:45:29.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:29 compute-0 podman[199733]: time="2025-09-30T07:45:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:45:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:45:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:45:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:45:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3010 "" "Go-http-client/1.1"
Sep 30 07:45:30 compute-0 sshd-session[224721]: Invalid user lighthouse from 52.224.109.126 port 53066
Sep 30 07:45:30 compute-0 sshd-session[224721]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:45:30 compute-0 sshd-session[224721]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:45:30 compute-0 sshd-session[224717]: Failed password for invalid user app from 52.224.109.126 port 53052 ssh2
Sep 30 07:45:30 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 07:45:30 compute-0 sshd-session[224713]: Connection closed by invalid user plexserver 52.224.109.126 port 53028 [preauth]
Sep 30 07:45:31 compute-0 sshd-session[224724]: Invalid user mysql from 52.224.109.126 port 53080
Sep 30 07:45:31 compute-0 sshd-session[224724]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:45:31 compute-0 sshd-session[224724]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:45:31 compute-0 openstack_network_exporter[201859]: ERROR   07:45:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:45:31 compute-0 openstack_network_exporter[201859]: ERROR   07:45:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:45:31 compute-0 openstack_network_exporter[201859]: ERROR   07:45:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:45:31 compute-0 openstack_network_exporter[201859]: ERROR   07:45:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:45:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:45:31 compute-0 openstack_network_exporter[201859]: ERROR   07:45:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:45:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:45:31 compute-0 sshd-session[224719]: Failed password for invalid user tools from 52.224.109.126 port 53054 ssh2
Sep 30 07:45:31 compute-0 unix_chkpwd[224728]: password check failed for user (root)
Sep 30 07:45:31 compute-0 sshd-session[224726]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126  user=root
Sep 30 07:45:32 compute-0 sshd-session[224717]: Connection closed by invalid user app 52.224.109.126 port 53052 [preauth]
Sep 30 07:45:32 compute-0 sshd-session[224721]: Failed password for invalid user lighthouse from 52.224.109.126 port 53066 ssh2
Sep 30 07:45:32 compute-0 sshd-session[224719]: Connection closed by invalid user tools 52.224.109.126 port 53054 [preauth]
Sep 30 07:45:32 compute-0 sshd-session[224721]: Connection closed by invalid user lighthouse 52.224.109.126 port 53066 [preauth]
Sep 30 07:45:32 compute-0 sshd-session[224729]: Invalid user gpadmin from 52.224.109.126 port 53086
Sep 30 07:45:32 compute-0 sshd-session[224729]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:45:32 compute-0 sshd-session[224729]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:45:33 compute-0 sshd-session[224724]: Failed password for invalid user mysql from 52.224.109.126 port 53080 ssh2
Sep 30 07:45:33 compute-0 sshd[124648]: drop connection #3 from [52.224.109.126]:35112 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:33 compute-0 nova_compute[189265]: 2025-09-30 07:45:33.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:33 compute-0 sshd-session[224724]: Connection closed by invalid user mysql 52.224.109.126 port 53080 [preauth]
Sep 30 07:45:34 compute-0 nova_compute[189265]: 2025-09-30 07:45:34.040 2 DEBUG nova.compute.manager [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphlbzyxc8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7e80d397-6a79-43e6-b663-fca5437ca2bb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 07:45:34 compute-0 unix_chkpwd[224733]: password check failed for user (root)
Sep 30 07:45:34 compute-0 sshd-session[224731]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Sep 30 07:45:34 compute-0 sshd-session[224726]: Failed password for root from 52.224.109.126 port 53084 ssh2
Sep 30 07:45:34 compute-0 sshd-session[224729]: Failed password for invalid user gpadmin from 52.224.109.126 port 53086 ssh2
Sep 30 07:45:34 compute-0 sshd[124648]: drop connection #3 from [52.224.109.126]:35118 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:34 compute-0 nova_compute[189265]: 2025-09-30 07:45:34.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:35 compute-0 nova_compute[189265]: 2025-09-30 07:45:35.059 2 DEBUG oslo_concurrency.lockutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-7e80d397-6a79-43e6-b663-fca5437ca2bb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:45:35 compute-0 nova_compute[189265]: 2025-09-30 07:45:35.060 2 DEBUG oslo_concurrency.lockutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-7e80d397-6a79-43e6-b663-fca5437ca2bb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:45:35 compute-0 nova_compute[189265]: 2025-09-30 07:45:35.060 2 DEBUG nova.network.neutron [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:45:35 compute-0 sshd[124648]: drop connection #3 from [52.224.109.126]:35132 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:35 compute-0 podman[224734]: 2025-09-30 07:45:35.46668678 +0000 UTC m=+0.057603914 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 07:45:35 compute-0 nova_compute[189265]: 2025-09-30 07:45:35.568 2 WARNING neutronclient.v2_0.client [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:45:35 compute-0 sshd-session[224729]: Connection closed by invalid user gpadmin 52.224.109.126 port 53086 [preauth]
Sep 30 07:45:35 compute-0 sshd-session[224726]: Connection closed by authenticating user root 52.224.109.126 port 53084 [preauth]
Sep 30 07:45:36 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:35148 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:36 compute-0 sshd-session[224731]: Failed password for root from 193.46.255.159 port 42762 ssh2
Sep 30 07:45:37 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:35164 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:37 compute-0 nova_compute[189265]: 2025-09-30 07:45:37.296 2 WARNING neutronclient.v2_0.client [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:45:37 compute-0 nova_compute[189265]: 2025-09-30 07:45:37.503 2 DEBUG nova.network.neutron [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Updating instance_info_cache with network_info: [{"id": "62fc82a5-e7b3-4c47-be06-d9404d5372a7", "address": "fa:16:3e:2a:58:1f", "network": {"id": "b5627ebe-9328-432f-88fb-b5b539662efd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-337845770-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6633f775c5d46dc9c6c213b63954b2f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62fc82a5-e7", "ovs_interfaceid": "62fc82a5-e7b3-4c47-be06-d9404d5372a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.010 2 DEBUG oslo_concurrency.lockutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-7e80d397-6a79-43e6-b663-fca5437ca2bb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.023 2 DEBUG nova.virt.libvirt.driver [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphlbzyxc8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7e80d397-6a79-43e6-b663-fca5437ca2bb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.024 2 DEBUG nova.virt.libvirt.driver [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Creating instance directory: /var/lib/nova/instances/7e80d397-6a79-43e6-b663-fca5437ca2bb pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.024 2 DEBUG nova.virt.libvirt.driver [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Creating disk.info with the contents: {'/var/lib/nova/instances/7e80d397-6a79-43e6-b663-fca5437ca2bb/disk': 'qcow2', '/var/lib/nova/instances/7e80d397-6a79-43e6-b663-fca5437ca2bb/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.025 2 DEBUG nova.virt.libvirt.driver [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.025 2 DEBUG nova.objects.instance [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid 7e80d397-6a79-43e6-b663-fca5437ca2bb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:45:38 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:35178 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:38 compute-0 unix_chkpwd[224775]: password check failed for user (root)
Sep 30 07:45:38 compute-0 podman[224754]: 2025-09-30 07:45:38.486101482 +0000 UTC m=+0.063266707 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., vcs-type=git, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.532 2 DEBUG oslo_utils.imageutils.format_inspector [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.539 2 DEBUG oslo_utils.imageutils.format_inspector [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.541 2 DEBUG oslo_concurrency.processutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.627 2 DEBUG oslo_concurrency.processutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.628 2 DEBUG oslo_concurrency.lockutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.628 2 DEBUG oslo_concurrency.lockutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.629 2 DEBUG oslo_utils.imageutils.format_inspector [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.631 2 DEBUG oslo_utils.imageutils.format_inspector [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.631 2 DEBUG oslo_concurrency.processutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.679 2 DEBUG oslo_concurrency.processutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.680 2 DEBUG oslo_concurrency.processutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/7e80d397-6a79-43e6-b663-fca5437ca2bb/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.707 2 DEBUG oslo_concurrency.processutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/7e80d397-6a79-43e6-b663-fca5437ca2bb/disk 1073741824" returned: 0 in 0.028s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.708 2 DEBUG oslo_concurrency.lockutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.080s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.709 2 DEBUG oslo_concurrency.processutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.758 2 DEBUG oslo_concurrency.processutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.759 2 DEBUG nova.virt.disk.api [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Checking if we can resize image /var/lib/nova/instances/7e80d397-6a79-43e6-b663-fca5437ca2bb/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.760 2 DEBUG oslo_concurrency.processutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e80d397-6a79-43e6-b663-fca5437ca2bb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.813 2 DEBUG oslo_concurrency.processutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7e80d397-6a79-43e6-b663-fca5437ca2bb/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.815 2 DEBUG nova.virt.disk.api [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Cannot resize image /var/lib/nova/instances/7e80d397-6a79-43e6-b663-fca5437ca2bb/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:45:38 compute-0 nova_compute[189265]: 2025-09-30 07:45:38.816 2 DEBUG nova.objects.instance [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'migration_context' on Instance uuid 7e80d397-6a79-43e6-b663-fca5437ca2bb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:45:39 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:35186 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.326 2 DEBUG nova.objects.base [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<7e80d397-6a79-43e6-b663-fca5437ca2bb> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.327 2 DEBUG oslo_concurrency.processutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7e80d397-6a79-43e6-b663-fca5437ca2bb/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.358 2 DEBUG oslo_concurrency.processutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/7e80d397-6a79-43e6-b663-fca5437ca2bb/disk.config 497664" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.359 2 DEBUG nova.virt.libvirt.driver [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.360 2 DEBUG nova.virt.libvirt.vif [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T07:44:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1460458105',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1460458105',id=28,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:44:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='dd45f15fbdba414c8d395e5ff149cbc4',ramdisk_id='',reservation_id='r-0xxy3odk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-359850667',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-359850667-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:44:50Z,user_data=None,user_id='bf911d50b77e4e20a250e642038c8043',uuid=7e80d397-6a79-43e6-b663-fca5437ca2bb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "62fc82a5-e7b3-4c47-be06-d9404d5372a7", "address": "fa:16:3e:2a:58:1f", "network": {"id": "b5627ebe-9328-432f-88fb-b5b539662efd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-337845770-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6633f775c5d46dc9c6c213b63954b2f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap62fc82a5-e7", "ovs_interfaceid": "62fc82a5-e7b3-4c47-be06-d9404d5372a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.360 2 DEBUG nova.network.os_vif_util [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "62fc82a5-e7b3-4c47-be06-d9404d5372a7", "address": "fa:16:3e:2a:58:1f", "network": {"id": "b5627ebe-9328-432f-88fb-b5b539662efd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-337845770-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6633f775c5d46dc9c6c213b63954b2f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap62fc82a5-e7", "ovs_interfaceid": "62fc82a5-e7b3-4c47-be06-d9404d5372a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.361 2 DEBUG nova.network.os_vif_util [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2a:58:1f,bridge_name='br-int',has_traffic_filtering=True,id=62fc82a5-e7b3-4c47-be06-d9404d5372a7,network=Network(b5627ebe-9328-432f-88fb-b5b539662efd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62fc82a5-e7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.362 2 DEBUG os_vif [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:58:1f,bridge_name='br-int',has_traffic_filtering=True,id=62fc82a5-e7b3-4c47-be06-d9404d5372a7,network=Network(b5627ebe-9328-432f-88fb-b5b539662efd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62fc82a5-e7') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.364 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.364 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.365 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '1305b663-9121-582c-88d1-c708dda5f1ec', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.369 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.371 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62fc82a5-e7, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.371 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap62fc82a5-e7, col_values=(('qos', UUID('a1a40aed-0a7d-4c50-97dd-d9d5fa41c72d')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.371 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap62fc82a5-e7, col_values=(('external_ids', {'iface-id': '62fc82a5-e7b3-4c47-be06-d9404d5372a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2a:58:1f', 'vm-uuid': '7e80d397-6a79-43e6-b663-fca5437ca2bb'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:39 compute-0 NetworkManager[51813]: <info>  [1759218339.3742] manager: (tap62fc82a5-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.379 2 INFO os_vif [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2a:58:1f,bridge_name='br-int',has_traffic_filtering=True,id=62fc82a5-e7b3-4c47-be06-d9404d5372a7,network=Network(b5627ebe-9328-432f-88fb-b5b539662efd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62fc82a5-e7')
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.379 2 DEBUG nova.virt.libvirt.driver [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.379 2 DEBUG nova.compute.manager [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphlbzyxc8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7e80d397-6a79-43e6-b663-fca5437ca2bb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.380 2 WARNING neutronclient.v2_0.client [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:45:39 compute-0 nova_compute[189265]: 2025-09-30 07:45:39.686 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:40 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:35202 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:40 compute-0 nova_compute[189265]: 2025-09-30 07:45:40.213 2 WARNING neutronclient.v2_0.client [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:45:40 compute-0 sshd-session[224731]: Failed password for root from 193.46.255.159 port 42762 ssh2
Sep 30 07:45:41 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:35206 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:41 compute-0 nova_compute[189265]: 2025-09-30 07:45:41.314 2 DEBUG nova.network.neutron [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Port 62fc82a5-e7b3-4c47-be06-d9404d5372a7 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 07:45:41 compute-0 nova_compute[189265]: 2025-09-30 07:45:41.329 2 DEBUG nova.compute.manager [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphlbzyxc8',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='7e80d397-6a79-43e6-b663-fca5437ca2bb',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 07:45:42 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:35220 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:42 compute-0 podman[224796]: 2025-09-30 07:45:42.503808444 +0000 UTC m=+0.085847029 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 07:45:42 compute-0 podman[224797]: 2025-09-30 07:45:42.526768087 +0000 UTC m=+0.103563060 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Sep 30 07:45:42 compute-0 podman[224798]: 2025-09-30 07:45:42.546642981 +0000 UTC m=+0.121099737 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Sep 30 07:45:42 compute-0 unix_chkpwd[224859]: password check failed for user (root)
Sep 30 07:45:42 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:35232 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:43 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:54790 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:44 compute-0 systemd[1]: Starting libvirt proxy daemon...
Sep 30 07:45:44 compute-0 systemd[1]: Started libvirt proxy daemon.
Sep 30 07:45:44 compute-0 kernel: tap62fc82a5-e7: entered promiscuous mode
Sep 30 07:45:44 compute-0 NetworkManager[51813]: <info>  [1759218344.3608] manager: (tap62fc82a5-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/94)
Sep 30 07:45:44 compute-0 ovn_controller[91436]: 2025-09-30T07:45:44Z|00271|binding|INFO|Claiming lport 62fc82a5-e7b3-4c47-be06-d9404d5372a7 for this additional chassis.
Sep 30 07:45:44 compute-0 ovn_controller[91436]: 2025-09-30T07:45:44Z|00272|binding|INFO|62fc82a5-e7b3-4c47-be06-d9404d5372a7: Claiming fa:16:3e:2a:58:1f 10.100.0.10
Sep 30 07:45:44 compute-0 nova_compute[189265]: 2025-09-30 07:45:44.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:44 compute-0 nova_compute[189265]: 2025-09-30 07:45:44.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:44 compute-0 nova_compute[189265]: 2025-09-30 07:45:44.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.398 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:58:1f 10.100.0.10'], port_security=['fa:16:3e:2a:58:1f 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '7e80d397-6a79-43e6-b663-fca5437ca2bb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5627ebe-9328-432f-88fb-b5b539662efd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd45f15fbdba414c8d395e5ff149cbc4', 'neutron:revision_number': '10', 'neutron:security_group_ids': '3b4f5d53-fbd5-497e-8555-6b61b9a4d332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4956bab-94e0-436a-b508-eeb3061671e6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=62fc82a5-e7b3-4c47-be06-d9404d5372a7) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.399 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 62fc82a5-e7b3-4c47-be06-d9404d5372a7 in datapath b5627ebe-9328-432f-88fb-b5b539662efd unbound from our chassis
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.401 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5627ebe-9328-432f-88fb-b5b539662efd
Sep 30 07:45:44 compute-0 systemd-machined[149233]: New machine qemu-23-instance-0000001c.
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.413 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f9186fe2-d82f-4c66-8f97-b524dad28746]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.414 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb5627ebe-91 in ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.416 210650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb5627ebe-90 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.416 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[8860a227-31a9-4374-9ccf-416995066f80]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.416 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7c3ce34d-1696-45ac-8a79-f3fbd24d2767]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.435 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[e2b19ea7-bdd8-45ea-a011-6329571a0f33]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 nova_compute[189265]: 2025-09-30 07:45:44.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:44 compute-0 systemd[1]: Started Virtual Machine qemu-23-instance-0000001c.
Sep 30 07:45:44 compute-0 nova_compute[189265]: 2025-09-30 07:45:44.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:44 compute-0 ovn_controller[91436]: 2025-09-30T07:45:44Z|00273|binding|INFO|Setting lport 62fc82a5-e7b3-4c47-be06-d9404d5372a7 ovn-installed in OVS
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.475 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb0da6f-0bfa-4e21-9cf3-ecbe42127ab7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 systemd-udevd[224895]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.506 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[fac41cc2-7981-41f6-b9ad-27afdc2d5a04]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 NetworkManager[51813]: <info>  [1759218344.5098] device (tap62fc82a5-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:45:44 compute-0 NetworkManager[51813]: <info>  [1759218344.5111] device (tap62fc82a5-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:45:44 compute-0 NetworkManager[51813]: <info>  [1759218344.5155] manager: (tapb5627ebe-90): new Veth device (/org/freedesktop/NetworkManager/Devices/95)
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.514 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[dfbb09d2-4437-4126-b203-9f42f5a7a6c9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.556 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[88322513-f517-460a-9286-2e6bff7aa6b7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.560 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[7a6eb0de-7ecc-4ea1-a9af-a0330d819be8]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 NetworkManager[51813]: <info>  [1759218344.5909] device (tapb5627ebe-90): carrier: link connected
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.596 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[5b04b849-b244-4458-8e3e-84d31022217d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.616 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[52bca470-f2d9-4c69-98ea-9510558248a2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5627ebe-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:a5:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 620231, 'reachable_time': 38656, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224925, 'error': None, 'target': 'ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.632 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[309b5ff5-7543-48d4-a647-428be11f2fab]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe04:a50f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 620231, 'tstamp': 620231}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224926, 'error': None, 'target': 'ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.650 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f8b6ef7d-73af-45e0-9dc5-0630733f1d97]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5627ebe-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:a5:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 620231, 'reachable_time': 38656, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224927, 'error': None, 'target': 'ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.686 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[ade7ac05-010e-4ee6-8b32-246ce75bdd2e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 nova_compute[189265]: 2025-09-30 07:45:44.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:44 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:54798 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.790 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[2a41f6da-cad8-4afe-8302-6d5db8896257]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.791 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5627ebe-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.791 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.792 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5627ebe-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:45:44 compute-0 kernel: tapb5627ebe-90: entered promiscuous mode
Sep 30 07:45:44 compute-0 nova_compute[189265]: 2025-09-30 07:45:44.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:44 compute-0 NetworkManager[51813]: <info>  [1759218344.7962] manager: (tapb5627ebe-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Sep 30 07:45:44 compute-0 nova_compute[189265]: 2025-09-30 07:45:44.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.797 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5627ebe-90, col_values=(('external_ids', {'iface-id': '694a3f4b-a908-40fb-abbe-4687074d8093'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:45:44 compute-0 nova_compute[189265]: 2025-09-30 07:45:44.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:44 compute-0 nova_compute[189265]: 2025-09-30 07:45:44.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:44 compute-0 ovn_controller[91436]: 2025-09-30T07:45:44Z|00274|binding|INFO|Releasing lport 694a3f4b-a908-40fb-abbe-4687074d8093 from this chassis (sb_readonly=0)
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.801 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[1debac9f-4152-4cb3-91f0-4c8961ccebeb]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.802 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.803 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.803 100322 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for b5627ebe-9328-432f-88fb-b5b539662efd disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.803 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.804 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7c524fe3-7970-49dd-92d1-ed3e3171c4bb]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.805 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.805 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[ea2c6a27-5115-4efd-88ab-c08b19e2d990]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.806 100322 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: global
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     log         /dev/log local0 debug
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     log-tag     haproxy-metadata-proxy-b5627ebe-9328-432f-88fb-b5b539662efd
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     user        root
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     group       root
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     maxconn     1024
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     pidfile     /var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     daemon
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: defaults
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     log global
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     mode http
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     option httplog
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     option dontlognull
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     option http-server-close
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     option forwardfor
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     retries                 3
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     timeout http-request    30s
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     timeout connect         30s
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     timeout client          32s
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     timeout server          32s
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     timeout http-keep-alive 30s
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: listen listener
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     bind 169.254.169.254:80
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:     http-request add-header X-OVN-Network-ID b5627ebe-9328-432f-88fb-b5b539662efd
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 07:45:44 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:44.808 100322 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd', 'env', 'PROCESS_TAG=haproxy-b5627ebe-9328-432f-88fb-b5b539662efd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b5627ebe-9328-432f-88fb-b5b539662efd.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 07:45:44 compute-0 nova_compute[189265]: 2025-09-30 07:45:44.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:44 compute-0 sshd-session[224731]: Failed password for root from 193.46.255.159 port 42762 ssh2
Sep 30 07:45:45 compute-0 podman[224965]: 2025-09-30 07:45:45.281579561 +0000 UTC m=+0.087374683 container create cc3bb7e380b9164f72c19683c3433c3ff66240db0783705f12eaceb438a72296 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Sep 30 07:45:45 compute-0 systemd[1]: Started libpod-conmon-cc3bb7e380b9164f72c19683c3433c3ff66240db0783705f12eaceb438a72296.scope.
Sep 30 07:45:45 compute-0 podman[224965]: 2025-09-30 07:45:45.240459174 +0000 UTC m=+0.046254346 image pull eeebcc09bc72f81ab45f5ab87eb8f6a7b554b949227aeec082bdb0732754ddc8 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 07:45:45 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:45:45 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/22496b6aa35e3410afccba630c33cb4c74e8c2e2050cb1ff5da48f8a15c9e6b9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 07:45:45 compute-0 podman[224965]: 2025-09-30 07:45:45.370761825 +0000 UTC m=+0.176556957 container init cc3bb7e380b9164f72c19683c3433c3ff66240db0783705f12eaceb438a72296 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd, tcib_build_tag=watcher_latest, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:45:45 compute-0 podman[224965]: 2025-09-30 07:45:45.376854311 +0000 UTC m=+0.182649413 container start cc3bb7e380b9164f72c19683c3433c3ff66240db0783705f12eaceb438a72296 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 07:45:45 compute-0 neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd[224979]: [NOTICE]   (224983) : New worker (224985) forked
Sep 30 07:45:45 compute-0 neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd[224979]: [NOTICE]   (224983) : Loading success.
Sep 30 07:45:45 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:54804 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:46 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:54818 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:46 compute-0 sshd-session[224731]: Received disconnect from 193.46.255.159 port 42762:11:  [preauth]
Sep 30 07:45:46 compute-0 sshd-session[224731]: Disconnected from authenticating user root 193.46.255.159 port 42762 [preauth]
Sep 30 07:45:46 compute-0 sshd-session[224731]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Sep 30 07:45:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:47.313 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:45:47 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:47.314 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:45:47 compute-0 nova_compute[189265]: 2025-09-30 07:45:47.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:47 compute-0 ovn_controller[91436]: 2025-09-30T07:45:47Z|00275|binding|INFO|Claiming lport 62fc82a5-e7b3-4c47-be06-d9404d5372a7 for this chassis.
Sep 30 07:45:47 compute-0 ovn_controller[91436]: 2025-09-30T07:45:47Z|00276|binding|INFO|62fc82a5-e7b3-4c47-be06-d9404d5372a7: Claiming fa:16:3e:2a:58:1f 10.100.0.10
Sep 30 07:45:47 compute-0 ovn_controller[91436]: 2025-09-30T07:45:47Z|00277|binding|INFO|Setting lport 62fc82a5-e7b3-4c47-be06-d9404d5372a7 up in Southbound
Sep 30 07:45:47 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:54834 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:47 compute-0 unix_chkpwd[225009]: password check failed for user (root)
Sep 30 07:45:47 compute-0 sshd-session[225006]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Sep 30 07:45:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:45:48.315 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:45:48 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:54850 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:48 compute-0 nova_compute[189265]: 2025-09-30 07:45:48.425 2 INFO nova.compute.manager [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Post operation of migration started
Sep 30 07:45:48 compute-0 nova_compute[189265]: 2025-09-30 07:45:48.425 2 WARNING neutronclient.v2_0.client [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:45:48 compute-0 nova_compute[189265]: 2025-09-30 07:45:48.534 2 WARNING neutronclient.v2_0.client [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:45:48 compute-0 nova_compute[189265]: 2025-09-30 07:45:48.534 2 WARNING neutronclient.v2_0.client [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:45:49 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:54866 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:49 compute-0 nova_compute[189265]: 2025-09-30 07:45:49.217 2 DEBUG oslo_concurrency.lockutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-7e80d397-6a79-43e6-b663-fca5437ca2bb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:45:49 compute-0 nova_compute[189265]: 2025-09-30 07:45:49.217 2 DEBUG oslo_concurrency.lockutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-7e80d397-6a79-43e6-b663-fca5437ca2bb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:45:49 compute-0 nova_compute[189265]: 2025-09-30 07:45:49.218 2 DEBUG nova.network.neutron [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:45:49 compute-0 sshd-session[225006]: Failed password for root from 193.46.255.159 port 50510 ssh2
Sep 30 07:45:49 compute-0 nova_compute[189265]: 2025-09-30 07:45:49.430 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:49 compute-0 nova_compute[189265]: 2025-09-30 07:45:49.727 2 WARNING neutronclient.v2_0.client [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:45:49 compute-0 nova_compute[189265]: 2025-09-30 07:45:49.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:49 compute-0 unix_chkpwd[225010]: password check failed for user (root)
Sep 30 07:45:50 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:54880 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:50 compute-0 nova_compute[189265]: 2025-09-30 07:45:50.549 2 WARNING neutronclient.v2_0.client [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:45:50 compute-0 nova_compute[189265]: 2025-09-30 07:45:50.700 2 DEBUG nova.network.neutron [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Updating instance_info_cache with network_info: [{"id": "62fc82a5-e7b3-4c47-be06-d9404d5372a7", "address": "fa:16:3e:2a:58:1f", "network": {"id": "b5627ebe-9328-432f-88fb-b5b539662efd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-337845770-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6633f775c5d46dc9c6c213b63954b2f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62fc82a5-e7", "ovs_interfaceid": "62fc82a5-e7b3-4c47-be06-d9404d5372a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:45:51 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:54884 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:51 compute-0 nova_compute[189265]: 2025-09-30 07:45:51.208 2 DEBUG oslo_concurrency.lockutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-7e80d397-6a79-43e6-b663-fca5437ca2bb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:45:51 compute-0 nova_compute[189265]: 2025-09-30 07:45:51.733 2 DEBUG oslo_concurrency.lockutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:45:51 compute-0 nova_compute[189265]: 2025-09-30 07:45:51.734 2 DEBUG oslo_concurrency.lockutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:45:51 compute-0 nova_compute[189265]: 2025-09-30 07:45:51.734 2 DEBUG oslo_concurrency.lockutils [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:45:51 compute-0 nova_compute[189265]: 2025-09-30 07:45:51.738 2 INFO nova.virt.libvirt.driver [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 07:45:51 compute-0 virtqemud[189090]: Domain id=23 name='instance-0000001c' uuid=7e80d397-6a79-43e6-b663-fca5437ca2bb is tainted: custom-monitor
Sep 30 07:45:52 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:54894 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:52 compute-0 sshd-session[225006]: Failed password for root from 193.46.255.159 port 50510 ssh2
Sep 30 07:45:52 compute-0 nova_compute[189265]: 2025-09-30 07:45:52.746 2 INFO nova.virt.libvirt.driver [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 07:45:52 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:54902 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:53 compute-0 nova_compute[189265]: 2025-09-30 07:45:53.752 2 INFO nova.virt.libvirt.driver [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 07:45:53 compute-0 nova_compute[189265]: 2025-09-30 07:45:53.758 2 DEBUG nova.compute.manager [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:45:53 compute-0 nova_compute[189265]: 2025-09-30 07:45:53.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:45:53 compute-0 sshd[124648]: drop connection #2 from [52.224.109.126]:51960 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:53 compute-0 unix_chkpwd[225012]: password check failed for user (root)
Sep 30 07:45:54 compute-0 nova_compute[189265]: 2025-09-30 07:45:54.270 2 DEBUG nova.objects.instance [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 07:45:54 compute-0 nova_compute[189265]: 2025-09-30 07:45:54.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:54 compute-0 nova_compute[189265]: 2025-09-30 07:45:54.774 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:54 compute-0 sshd[124648]: drop connection #2 from [52.224.109.126]:51964 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:55 compute-0 nova_compute[189265]: 2025-09-30 07:45:55.294 2 WARNING neutronclient.v2_0.client [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:45:55 compute-0 nova_compute[189265]: 2025-09-30 07:45:55.365 2 WARNING neutronclient.v2_0.client [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:45:55 compute-0 nova_compute[189265]: 2025-09-30 07:45:55.366 2 WARNING neutronclient.v2_0.client [None req-895fb865-77a6-4e26-aa9a-b33a01eb6d6b e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:45:55 compute-0 sshd-session[225011]: Invalid user prueba from 80.94.95.115 port 48974
Sep 30 07:45:55 compute-0 sshd[124648]: drop connection #2 from [52.224.109.126]:51978 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:55 compute-0 nova_compute[189265]: 2025-09-30 07:45:55.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:45:55 compute-0 sshd-session[225011]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:45:55 compute-0 sshd-session[225011]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.95.115
Sep 30 07:45:56 compute-0 podman[225014]: 2025-09-30 07:45:56.013137992 +0000 UTC m=+0.100781489 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:45:56 compute-0 sshd-session[225006]: Failed password for root from 193.46.255.159 port 50510 ssh2
Sep 30 07:45:56 compute-0 sshd[124648]: drop connection #2 from [52.224.109.126]:51994 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:56 compute-0 nova_compute[189265]: 2025-09-30 07:45:56.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:45:57 compute-0 sshd[124648]: drop connection #2 from [52.224.109.126]:52002 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:57 compute-0 sshd-session[225011]: Failed password for invalid user prueba from 80.94.95.115 port 48974 ssh2
Sep 30 07:45:58 compute-0 sshd-session[225006]: Received disconnect from 193.46.255.159 port 50510:11:  [preauth]
Sep 30 07:45:58 compute-0 sshd-session[225006]: Disconnected from authenticating user root 193.46.255.159 port 50510 [preauth]
Sep 30 07:45:58 compute-0 sshd-session[225006]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Sep 30 07:45:58 compute-0 sshd-session[225011]: Connection closed by invalid user prueba 80.94.95.115 port 48974 [preauth]
Sep 30 07:45:58 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:52006 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:58 compute-0 unix_chkpwd[225041]: password check failed for user (root)
Sep 30 07:45:58 compute-0 sshd-session[225039]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Sep 30 07:45:59 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:52012 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:45:59 compute-0 nova_compute[189265]: 2025-09-30 07:45:59.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:45:59 compute-0 podman[199733]: time="2025-09-30T07:45:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:45:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:45:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:45:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:45:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3479 "" "Go-http-client/1.1"
Sep 30 07:45:59 compute-0 nova_compute[189265]: 2025-09-30 07:45:59.776 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:00 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:52024 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:00 compute-0 sshd-session[225039]: Failed password for root from 193.46.255.159 port 53270 ssh2
Sep 30 07:46:00 compute-0 nova_compute[189265]: 2025-09-30 07:46:00.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:46:00 compute-0 nova_compute[189265]: 2025-09-30 07:46:00.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:46:01 compute-0 unix_chkpwd[225042]: password check failed for user (root)
Sep 30 07:46:01 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:52036 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:01 compute-0 openstack_network_exporter[201859]: ERROR   07:46:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:46:01 compute-0 openstack_network_exporter[201859]: ERROR   07:46:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:46:01 compute-0 openstack_network_exporter[201859]: ERROR   07:46:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:46:01 compute-0 openstack_network_exporter[201859]: ERROR   07:46:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:46:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:46:01 compute-0 openstack_network_exporter[201859]: ERROR   07:46:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:46:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:46:02 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:52042 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:03 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:56290 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:03 compute-0 sshd-session[225039]: Failed password for root from 193.46.255.159 port 53270 ssh2
Sep 30 07:46:04 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:56298 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:04 compute-0 nova_compute[189265]: 2025-09-30 07:46:04.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:04 compute-0 nova_compute[189265]: 2025-09-30 07:46:04.778 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:04 compute-0 nova_compute[189265]: 2025-09-30 07:46:04.986 2 DEBUG oslo_concurrency.lockutils [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Acquiring lock "7e80d397-6a79-43e6-b663-fca5437ca2bb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:46:04 compute-0 nova_compute[189265]: 2025-09-30 07:46:04.986 2 DEBUG oslo_concurrency.lockutils [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Lock "7e80d397-6a79-43e6-b663-fca5437ca2bb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:46:04 compute-0 nova_compute[189265]: 2025-09-30 07:46:04.987 2 DEBUG oslo_concurrency.lockutils [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Acquiring lock "7e80d397-6a79-43e6-b663-fca5437ca2bb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:46:04 compute-0 nova_compute[189265]: 2025-09-30 07:46:04.987 2 DEBUG oslo_concurrency.lockutils [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Lock "7e80d397-6a79-43e6-b663-fca5437ca2bb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:46:04 compute-0 nova_compute[189265]: 2025-09-30 07:46:04.988 2 DEBUG oslo_concurrency.lockutils [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Lock "7e80d397-6a79-43e6-b663-fca5437ca2bb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.006 2 INFO nova.compute.manager [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Terminating instance
Sep 30 07:46:05 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:56314 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:05 compute-0 unix_chkpwd[225043]: password check failed for user (root)
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.526 2 DEBUG nova.compute.manager [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:46:05 compute-0 kernel: tap62fc82a5-e7 (unregistering): left promiscuous mode
Sep 30 07:46:05 compute-0 NetworkManager[51813]: <info>  [1759218365.5535] device (tap62fc82a5-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:05 compute-0 ovn_controller[91436]: 2025-09-30T07:46:05Z|00278|binding|INFO|Releasing lport 62fc82a5-e7b3-4c47-be06-d9404d5372a7 from this chassis (sb_readonly=0)
Sep 30 07:46:05 compute-0 ovn_controller[91436]: 2025-09-30T07:46:05Z|00279|binding|INFO|Setting lport 62fc82a5-e7b3-4c47-be06-d9404d5372a7 down in Southbound
Sep 30 07:46:05 compute-0 ovn_controller[91436]: 2025-09-30T07:46:05Z|00280|binding|INFO|Removing iface tap62fc82a5-e7 ovn-installed in OVS
Sep 30 07:46:05 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:05.566 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2a:58:1f 10.100.0.10'], port_security=['fa:16:3e:2a:58:1f 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '7e80d397-6a79-43e6-b663-fca5437ca2bb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5627ebe-9328-432f-88fb-b5b539662efd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd45f15fbdba414c8d395e5ff149cbc4', 'neutron:revision_number': '15', 'neutron:security_group_ids': '3b4f5d53-fbd5-497e-8555-6b61b9a4d332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4956bab-94e0-436a-b508-eeb3061671e6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=62fc82a5-e7b3-4c47-be06-d9404d5372a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:46:05 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:05.566 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 62fc82a5-e7b3-4c47-be06-d9404d5372a7 in datapath b5627ebe-9328-432f-88fb-b5b539662efd unbound from our chassis
Sep 30 07:46:05 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:05.568 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b5627ebe-9328-432f-88fb-b5b539662efd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:46:05 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:05.572 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[da84dc21-b056-4e77-8642-7d9c14f7d957]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:46:05 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:05.573 100322 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd namespace which is not needed anymore
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.583 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:05 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Sep 30 07:46:05 compute-0 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000001c.scope: Consumed 2.080s CPU time.
Sep 30 07:46:05 compute-0 systemd-machined[149233]: Machine qemu-23-instance-0000001c terminated.
Sep 30 07:46:05 compute-0 podman[225046]: 2025-09-30 07:46:05.673838691 +0000 UTC m=+0.073226805 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:46:05 compute-0 neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd[224979]: [NOTICE]   (224983) : haproxy version is 3.0.5-8e879a5
Sep 30 07:46:05 compute-0 neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd[224979]: [NOTICE]   (224983) : path to executable is /usr/sbin/haproxy
Sep 30 07:46:05 compute-0 neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd[224979]: [WARNING]  (224983) : Exiting Master process...
Sep 30 07:46:05 compute-0 podman[225078]: 2025-09-30 07:46:05.684330034 +0000 UTC m=+0.034048154 container kill cc3bb7e380b9164f72c19683c3433c3ff66240db0783705f12eaceb438a72296 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 07:46:05 compute-0 neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd[224979]: [ALERT]    (224983) : Current worker (224985) exited with code 143 (Terminated)
Sep 30 07:46:05 compute-0 neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd[224979]: [WARNING]  (224983) : All workers exited. Exiting... (0)
Sep 30 07:46:05 compute-0 systemd[1]: libpod-cc3bb7e380b9164f72c19683c3433c3ff66240db0783705f12eaceb438a72296.scope: Deactivated successfully.
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.690 2 DEBUG nova.compute.manager [req-1e3de69e-a0ed-4114-a667-a1437f08b3c3 req-8a6d230e-158c-4670-a145-fb058c23992e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Received event network-vif-unplugged-62fc82a5-e7b3-4c47-be06-d9404d5372a7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.690 2 DEBUG oslo_concurrency.lockutils [req-1e3de69e-a0ed-4114-a667-a1437f08b3c3 req-8a6d230e-158c-4670-a145-fb058c23992e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "7e80d397-6a79-43e6-b663-fca5437ca2bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.691 2 DEBUG oslo_concurrency.lockutils [req-1e3de69e-a0ed-4114-a667-a1437f08b3c3 req-8a6d230e-158c-4670-a145-fb058c23992e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7e80d397-6a79-43e6-b663-fca5437ca2bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.691 2 DEBUG oslo_concurrency.lockutils [req-1e3de69e-a0ed-4114-a667-a1437f08b3c3 req-8a6d230e-158c-4670-a145-fb058c23992e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7e80d397-6a79-43e6-b663-fca5437ca2bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.691 2 DEBUG nova.compute.manager [req-1e3de69e-a0ed-4114-a667-a1437f08b3c3 req-8a6d230e-158c-4670-a145-fb058c23992e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] No waiting events found dispatching network-vif-unplugged-62fc82a5-e7b3-4c47-be06-d9404d5372a7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.691 2 DEBUG nova.compute.manager [req-1e3de69e-a0ed-4114-a667-a1437f08b3c3 req-8a6d230e-158c-4670-a145-fb058c23992e 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Received event network-vif-unplugged-62fc82a5-e7b3-4c47-be06-d9404d5372a7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:46:05 compute-0 podman[225102]: 2025-09-30 07:46:05.728985683 +0000 UTC m=+0.028202035 container died cc3bb7e380b9164f72c19683c3433c3ff66240db0783705f12eaceb438a72296 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:05 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cc3bb7e380b9164f72c19683c3433c3ff66240db0783705f12eaceb438a72296-userdata-shm.mount: Deactivated successfully.
Sep 30 07:46:05 compute-0 systemd[1]: var-lib-containers-storage-overlay-22496b6aa35e3410afccba630c33cb4c74e8c2e2050cb1ff5da48f8a15c9e6b9-merged.mount: Deactivated successfully.
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.791 2 INFO nova.virt.libvirt.driver [-] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Instance destroyed successfully.
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.791 2 DEBUG nova.objects.instance [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Lazy-loading 'resources' on Instance uuid 7e80d397-6a79-43e6-b663-fca5437ca2bb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:46:05 compute-0 podman[225102]: 2025-09-30 07:46:05.814770159 +0000 UTC m=+0.113986491 container cleanup cc3bb7e380b9164f72c19683c3433c3ff66240db0783705f12eaceb438a72296 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 07:46:05 compute-0 systemd[1]: libpod-conmon-cc3bb7e380b9164f72c19683c3433c3ff66240db0783705f12eaceb438a72296.scope: Deactivated successfully.
Sep 30 07:46:05 compute-0 podman[225108]: 2025-09-30 07:46:05.87956282 +0000 UTC m=+0.166492357 container remove cc3bb7e380b9164f72c19683c3433c3ff66240db0783705f12eaceb438a72296 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Sep 30 07:46:05 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:05.887 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[a3b0b68a-222c-49d0-b5ab-6397fd121788]: (4, ("Tue Sep 30 07:46:05 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd (cc3bb7e380b9164f72c19683c3433c3ff66240db0783705f12eaceb438a72296)\ncc3bb7e380b9164f72c19683c3433c3ff66240db0783705f12eaceb438a72296\nTue Sep 30 07:46:05 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd (cc3bb7e380b9164f72c19683c3433c3ff66240db0783705f12eaceb438a72296)\ncc3bb7e380b9164f72c19683c3433c3ff66240db0783705f12eaceb438a72296\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:46:05 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:05.888 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[66b08be2-fa05-4ab4-9573-288ff63faeaf]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:46:05 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:05.889 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:46:05 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:05.890 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[1e24cafa-bbf1-4197-be21-da6e0a83bca5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:46:05 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:05.890 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5627ebe-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.892 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:05 compute-0 kernel: tapb5627ebe-90: left promiscuous mode
Sep 30 07:46:05 compute-0 nova_compute[189265]: 2025-09-30 07:46:05.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:05 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:05.914 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb9d960-deaf-4c58-a757-f7786ea5396f]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:46:05 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:56326 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:05 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:05.946 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[4e8d698e-eb1e-4f38-b1ba-2ffeed1cf729]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:46:05 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:05.947 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[bf607b81-836b-44a5-b182-cca9bd979394]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:46:05 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:05.967 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[dd6aa85a-2d8b-4b8f-888f-7cc660bba77a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 620222, 'reachable_time': 19140, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225151, 'error': None, 'target': 'ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:46:05 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:05.970 100440 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 07:46:05 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:05.970 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[fd8c1ef3-4cc9-45bc-9ca8-5e59c97e4c9d]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:46:05 compute-0 systemd[1]: run-netns-ovnmeta\x2db5627ebe\x2d9328\x2d432f\x2d88fb\x2db5b539662efd.mount: Deactivated successfully.
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.298 2 DEBUG nova.virt.libvirt.vif [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T07:44:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-1460458105',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-1460458105',id=28,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:44:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dd45f15fbdba414c8d395e5ff149cbc4',ramdisk_id='',reservation_id='r-0xxy3odk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mod
el='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-359850667',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-359850667-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:45:54Z,user_data=None,user_id='bf911d50b77e4e20a250e642038c8043',uuid=7e80d397-6a79-43e6-b663-fca5437ca2bb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "62fc82a5-e7b3-4c47-be06-d9404d5372a7", "address": "fa:16:3e:2a:58:1f", "network": {"id": "b5627ebe-9328-432f-88fb-b5b539662efd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-337845770-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6633f775c5d46dc9c6c213b63954b2f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62fc82a5-e7", "ovs_interfaceid": "62fc82a5-e7b3-4c47-be06-d9404d5372a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.298 2 DEBUG nova.network.os_vif_util [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Converting VIF {"id": "62fc82a5-e7b3-4c47-be06-d9404d5372a7", "address": "fa:16:3e:2a:58:1f", "network": {"id": "b5627ebe-9328-432f-88fb-b5b539662efd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-337845770-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6633f775c5d46dc9c6c213b63954b2f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62fc82a5-e7", "ovs_interfaceid": "62fc82a5-e7b3-4c47-be06-d9404d5372a7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.299 2 DEBUG nova.network.os_vif_util [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:2a:58:1f,bridge_name='br-int',has_traffic_filtering=True,id=62fc82a5-e7b3-4c47-be06-d9404d5372a7,network=Network(b5627ebe-9328-432f-88fb-b5b539662efd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62fc82a5-e7') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.299 2 DEBUG os_vif [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:2a:58:1f,bridge_name='br-int',has_traffic_filtering=True,id=62fc82a5-e7b3-4c47-be06-d9404d5372a7,network=Network(b5627ebe-9328-432f-88fb-b5b539662efd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62fc82a5-e7') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.302 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.303 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.303 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.303 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.304 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62fc82a5-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.353 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=a1a40aed-0a7d-4c50-97dd-d9d5fa41c72d) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.354 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.357 2 INFO os_vif [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:2a:58:1f,bridge_name='br-int',has_traffic_filtering=True,id=62fc82a5-e7b3-4c47-be06-d9404d5372a7,network=Network(b5627ebe-9328-432f-88fb-b5b539662efd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62fc82a5-e7')
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.358 2 INFO nova.virt.libvirt.driver [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Deleting instance files /var/lib/nova/instances/7e80d397-6a79-43e6-b663-fca5437ca2bb_del
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.358 2 INFO nova.virt.libvirt.driver [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Deletion of /var/lib/nova/instances/7e80d397-6a79-43e6-b663-fca5437ca2bb_del complete
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.873 2 INFO nova.compute.manager [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Took 1.35 seconds to destroy the instance on the hypervisor.
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.873 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.874 2 DEBUG nova.compute.manager [-] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.874 2 DEBUG nova.network.neutron [-] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:46:06 compute-0 nova_compute[189265]: 2025-09-30 07:46:06.875 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:46:06 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:56338 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.249 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:46:07 compute-0 sshd-session[225039]: Failed password for root from 193.46.255.159 port 53270 ssh2
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.396 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Error from libvirt while getting description of instance-0000001c: [Error Code 42] Domain not found: no domain with matching uuid '7e80d397-6a79-43e6-b663-fca5437ca2bb' (instance-0000001c): libvirt.libvirtError: Domain not found: no domain with matching uuid '7e80d397-6a79-43e6-b663-fca5437ca2bb' (instance-0000001c)
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.580 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.581 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.598 2 DEBUG nova.compute.manager [req-fd370f0c-8a24-46f9-bb2b-4bbc18f643e2 req-be64a685-15b3-4ad1-8eb3-eb3e5868e9a1 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Received event network-vif-deleted-62fc82a5-e7b3-4c47-be06-d9404d5372a7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.599 2 INFO nova.compute.manager [req-fd370f0c-8a24-46f9-bb2b-4bbc18f643e2 req-be64a685-15b3-4ad1-8eb3-eb3e5868e9a1 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Neutron deleted interface 62fc82a5-e7b3-4c47-be06-d9404d5372a7; detaching it from the instance and deleting it from the info cache
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.599 2 DEBUG nova.network.neutron [req-fd370f0c-8a24-46f9-bb2b-4bbc18f643e2 req-be64a685-15b3-4ad1-8eb3-eb3e5868e9a1 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.623 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.042s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.624 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5808MB free_disk=73.27460861206055GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.625 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.625 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.774 2 DEBUG nova.compute.manager [req-270a92e1-2088-4ccd-9e69-e4900c5a34ad req-1e5c177e-641c-4930-a469-363a8070be42 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Received event network-vif-unplugged-62fc82a5-e7b3-4c47-be06-d9404d5372a7 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.775 2 DEBUG oslo_concurrency.lockutils [req-270a92e1-2088-4ccd-9e69-e4900c5a34ad req-1e5c177e-641c-4930-a469-363a8070be42 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "7e80d397-6a79-43e6-b663-fca5437ca2bb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.775 2 DEBUG oslo_concurrency.lockutils [req-270a92e1-2088-4ccd-9e69-e4900c5a34ad req-1e5c177e-641c-4930-a469-363a8070be42 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7e80d397-6a79-43e6-b663-fca5437ca2bb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.775 2 DEBUG oslo_concurrency.lockutils [req-270a92e1-2088-4ccd-9e69-e4900c5a34ad req-1e5c177e-641c-4930-a469-363a8070be42 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "7e80d397-6a79-43e6-b663-fca5437ca2bb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.775 2 DEBUG nova.compute.manager [req-270a92e1-2088-4ccd-9e69-e4900c5a34ad req-1e5c177e-641c-4930-a469-363a8070be42 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] No waiting events found dispatching network-vif-unplugged-62fc82a5-e7b3-4c47-be06-d9404d5372a7 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:46:07 compute-0 nova_compute[189265]: 2025-09-30 07:46:07.775 2 DEBUG nova.compute.manager [req-270a92e1-2088-4ccd-9e69-e4900c5a34ad req-1e5c177e-641c-4930-a469-363a8070be42 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Received event network-vif-unplugged-62fc82a5-e7b3-4c47-be06-d9404d5372a7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:46:07 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:56350 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:08 compute-0 nova_compute[189265]: 2025-09-30 07:46:08.026 2 DEBUG nova.network.neutron [-] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:46:08 compute-0 nova_compute[189265]: 2025-09-30 07:46:08.112 2 DEBUG nova.compute.manager [req-fd370f0c-8a24-46f9-bb2b-4bbc18f643e2 req-be64a685-15b3-4ad1-8eb3-eb3e5868e9a1 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Detach interface failed, port_id=62fc82a5-e7b3-4c47-be06-d9404d5372a7, reason: Instance 7e80d397-6a79-43e6-b663-fca5437ca2bb could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 07:46:08 compute-0 nova_compute[189265]: 2025-09-30 07:46:08.533 2 INFO nova.compute.manager [-] [instance: 7e80d397-6a79-43e6-b663-fca5437ca2bb] Took 1.66 seconds to deallocate network for instance.
Sep 30 07:46:08 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:56360 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:09 compute-0 nova_compute[189265]: 2025-09-30 07:46:09.058 2 DEBUG oslo_concurrency.lockutils [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:46:09 compute-0 nova_compute[189265]: 2025-09-30 07:46:09.327 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance 7e80d397-6a79-43e6-b663-fca5437ca2bb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:46:09 compute-0 nova_compute[189265]: 2025-09-30 07:46:09.328 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:46:09 compute-0 nova_compute[189265]: 2025-09-30 07:46:09.328 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:46:07 up  1:43,  0 user,  load average: 0.17, 0.31, 0.30\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_deleting': '1', 'num_os_type_None': '1', 'num_proj_dd45f15fbdba414c8d395e5ff149cbc4': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:46:09 compute-0 nova_compute[189265]: 2025-09-30 07:46:09.369 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:46:09 compute-0 sshd-session[225039]: Received disconnect from 193.46.255.159 port 53270:11:  [preauth]
Sep 30 07:46:09 compute-0 sshd-session[225039]: Disconnected from authenticating user root 193.46.255.159 port 53270 [preauth]
Sep 30 07:46:09 compute-0 sshd-session[225039]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=193.46.255.159  user=root
Sep 30 07:46:09 compute-0 podman[225154]: 2025-09-30 07:46:09.501562879 +0000 UTC m=+0.073388980 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vcs-type=git, architecture=x86_64, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., distribution-scope=public, name=ubi9-minimal)
Sep 30 07:46:09 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56376 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:09 compute-0 nova_compute[189265]: 2025-09-30 07:46:09.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:09 compute-0 nova_compute[189265]: 2025-09-30 07:46:09.877 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:46:10 compute-0 nova_compute[189265]: 2025-09-30 07:46:10.387 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:46:10 compute-0 nova_compute[189265]: 2025-09-30 07:46:10.387 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.762s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:46:10 compute-0 nova_compute[189265]: 2025-09-30 07:46:10.387 2 DEBUG oslo_concurrency.lockutils [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 1.329s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:46:10 compute-0 nova_compute[189265]: 2025-09-30 07:46:10.424 2 DEBUG nova.compute.provider_tree [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:46:10 compute-0 sshd-session[225176]: Invalid user observer from 52.224.109.126 port 56380
Sep 30 07:46:10 compute-0 sshd-session[225176]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:46:10 compute-0 sshd-session[225176]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:46:10 compute-0 nova_compute[189265]: 2025-09-30 07:46:10.931 2 DEBUG nova.scheduler.client.report [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:46:11 compute-0 nova_compute[189265]: 2025-09-30 07:46:11.387 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:46:11 compute-0 nova_compute[189265]: 2025-09-30 07:46:11.388 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:46:11 compute-0 nova_compute[189265]: 2025-09-30 07:46:11.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:11 compute-0 nova_compute[189265]: 2025-09-30 07:46:11.440 2 DEBUG oslo_concurrency.lockutils [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.053s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:46:11 compute-0 nova_compute[189265]: 2025-09-30 07:46:11.462 2 INFO nova.scheduler.client.report [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Deleted allocations for instance 7e80d397-6a79-43e6-b663-fca5437ca2bb
Sep 30 07:46:11 compute-0 sshd-session[225179]: Invalid user docker from 52.224.109.126 port 56384
Sep 30 07:46:11 compute-0 sshd-session[225179]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:46:11 compute-0 sshd-session[225179]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:46:12 compute-0 sshd-session[225176]: Failed password for invalid user observer from 52.224.109.126 port 56380 ssh2
Sep 30 07:46:12 compute-0 nova_compute[189265]: 2025-09-30 07:46:12.494 2 DEBUG oslo_concurrency.lockutils [None req-0d687846-ce8f-4520-b7ae-1a6613cbea39 bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Lock "7e80d397-6a79-43e6-b663-fca5437ca2bb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.507s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:46:12 compute-0 sshd-session[225181]: Invalid user user from 52.224.109.126 port 56386
Sep 30 07:46:12 compute-0 sshd-session[225181]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:46:12 compute-0 sshd-session[225181]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:46:12 compute-0 podman[225183]: 2025-09-30 07:46:12.614183562 +0000 UTC m=+0.054072482 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 07:46:12 compute-0 podman[225184]: 2025-09-30 07:46:12.644715114 +0000 UTC m=+0.079186457 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 07:46:12 compute-0 podman[225185]: 2025-09-30 07:46:12.671174337 +0000 UTC m=+0.099347319 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true)
Sep 30 07:46:13 compute-0 sshd-session[225176]: Connection closed by invalid user observer 52.224.109.126 port 56380 [preauth]
Sep 30 07:46:13 compute-0 sshd-session[225245]: Invalid user elastic from 52.224.109.126 port 35854
Sep 30 07:46:13 compute-0 sshd-session[225245]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:46:13 compute-0 sshd-session[225245]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:46:13 compute-0 sshd-session[225179]: Failed password for invalid user docker from 52.224.109.126 port 56384 ssh2
Sep 30 07:46:13 compute-0 nova_compute[189265]: 2025-09-30 07:46:13.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:46:14 compute-0 sshd-session[225247]: Invalid user oracle from 52.224.109.126 port 35858
Sep 30 07:46:14 compute-0 sshd-session[225247]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:46:14 compute-0 sshd-session[225247]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:46:14 compute-0 sshd-session[225181]: Failed password for invalid user user from 52.224.109.126 port 56386 ssh2
Sep 30 07:46:14 compute-0 nova_compute[189265]: 2025-09-30 07:46:14.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:46:14 compute-0 nova_compute[189265]: 2025-09-30 07:46:14.812 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:14 compute-0 sshd-session[225181]: Connection closed by invalid user user 52.224.109.126 port 56386 [preauth]
Sep 30 07:46:15 compute-0 sshd-session[225179]: Connection closed by invalid user docker 52.224.109.126 port 56384 [preauth]
Sep 30 07:46:15 compute-0 sshd-session[225249]: Invalid user postgres from 52.224.109.126 port 35870
Sep 30 07:46:15 compute-0 sshd-session[225249]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:46:15 compute-0 sshd-session[225249]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:46:16 compute-0 sshd-session[225245]: Failed password for invalid user elastic from 52.224.109.126 port 35854 ssh2
Sep 30 07:46:16 compute-0 sshd-session[225247]: Failed password for invalid user oracle from 52.224.109.126 port 35858 ssh2
Sep 30 07:46:16 compute-0 sshd-session[225251]: Invalid user ts from 52.224.109.126 port 35886
Sep 30 07:46:16 compute-0 sshd-session[225251]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:46:16 compute-0 sshd-session[225251]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:46:16 compute-0 nova_compute[189265]: 2025-09-30 07:46:16.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:17 compute-0 unix_chkpwd[225255]: password check failed for user (root)
Sep 30 07:46:17 compute-0 sshd-session[225253]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126  user=root
Sep 30 07:46:17 compute-0 sshd-session[225247]: Connection closed by invalid user oracle 52.224.109.126 port 35858 [preauth]
Sep 30 07:46:17 compute-0 sshd-session[225245]: Connection closed by invalid user elastic 52.224.109.126 port 35854 [preauth]
Sep 30 07:46:17 compute-0 sshd-session[225249]: Failed password for invalid user postgres from 52.224.109.126 port 35870 ssh2
Sep 30 07:46:18 compute-0 sshd[124648]: drop connection #3 from [52.224.109.126]:35906 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:18 compute-0 sshd-session[225251]: Failed password for invalid user ts from 52.224.109.126 port 35886 ssh2
Sep 30 07:46:18 compute-0 sshd-session[225249]: Connection closed by invalid user postgres 52.224.109.126 port 35870 [preauth]
Sep 30 07:46:19 compute-0 sshd[124648]: drop connection #2 from [52.224.109.126]:35914 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:19 compute-0 sshd-session[225253]: Failed password for root from 52.224.109.126 port 35890 ssh2
Sep 30 07:46:19 compute-0 nova_compute[189265]: 2025-09-30 07:46:19.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:19 compute-0 sshd[124648]: drop connection #2 from [52.224.109.126]:35924 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:20.593 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:46:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:20.593 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:46:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:46:20.594 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:46:20 compute-0 sshd-session[225251]: Connection closed by invalid user ts 52.224.109.126 port 35886 [preauth]
Sep 30 07:46:20 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:35932 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:21 compute-0 sshd-session[225253]: Connection closed by authenticating user root 52.224.109.126 port 35890 [preauth]
Sep 30 07:46:21 compute-0 nova_compute[189265]: 2025-09-30 07:46:21.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:21 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:35934 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:22 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:35950 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:23 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:41868 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:24 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:41874 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:24 compute-0 nova_compute[189265]: 2025-09-30 07:46:24.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:25 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:41884 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:26 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:41898 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:26 compute-0 nova_compute[189265]: 2025-09-30 07:46:26.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:26 compute-0 podman[225257]: 2025-09-30 07:46:26.495013865 +0000 UTC m=+0.062295609 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:46:27 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:41906 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:28 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:41912 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:29 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:41928 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:29 compute-0 podman[199733]: time="2025-09-30T07:46:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:46:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:46:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:46:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:46:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3019 "" "Go-http-client/1.1"
Sep 30 07:46:29 compute-0 nova_compute[189265]: 2025-09-30 07:46:29.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:29 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:41930 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:30 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:41944 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:31 compute-0 openstack_network_exporter[201859]: ERROR   07:46:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:46:31 compute-0 openstack_network_exporter[201859]: ERROR   07:46:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:46:31 compute-0 openstack_network_exporter[201859]: ERROR   07:46:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:46:31 compute-0 openstack_network_exporter[201859]: ERROR   07:46:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:46:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:46:31 compute-0 openstack_network_exporter[201859]: ERROR   07:46:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:46:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:46:31 compute-0 nova_compute[189265]: 2025-09-30 07:46:31.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:31 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:41958 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:32 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:41970 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:33 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53452 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:34 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53464 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:34 compute-0 nova_compute[189265]: 2025-09-30 07:46:34.857 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:35 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53480 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:36 compute-0 nova_compute[189265]: 2025-09-30 07:46:36.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:36 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53490 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:36 compute-0 podman[225283]: 2025-09-30 07:46:36.502251317 +0000 UTC m=+0.082172243 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:46:37 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53506 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:38 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53514 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:39 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53520 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:39 compute-0 nova_compute[189265]: 2025-09-30 07:46:39.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:40 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53532 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:40 compute-0 podman[225303]: 2025-09-30 07:46:40.466483243 +0000 UTC m=+0.057890512 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=edpm, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Sep 30 07:46:41 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53544 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:41 compute-0 nova_compute[189265]: 2025-09-30 07:46:41.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:41 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53552 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:42 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:53566 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:43 compute-0 podman[225324]: 2025-09-30 07:46:43.473546738 +0000 UTC m=+0.059106927 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=watcher_latest)
Sep 30 07:46:43 compute-0 podman[225325]: 2025-09-30 07:46:43.478288515 +0000 UTC m=+0.060566449 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4)
Sep 30 07:46:43 compute-0 podman[225326]: 2025-09-30 07:46:43.493412212 +0000 UTC m=+0.075789199 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller)
Sep 30 07:46:44 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:43908 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:44 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:43914 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:44 compute-0 nova_compute[189265]: 2025-09-30 07:46:44.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:45 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:43916 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:46 compute-0 nova_compute[189265]: 2025-09-30 07:46:46.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:46 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:43928 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:47 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:43944 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:48 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:43956 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:49 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:43968 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:49 compute-0 nova_compute[189265]: 2025-09-30 07:46:49.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:50 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:43982 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:51 compute-0 nova_compute[189265]: 2025-09-30 07:46:51.487 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:51 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:43990 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:52 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:44002 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:53 compute-0 sshd-session[225386]: Invalid user admin from 52.224.109.126 port 56124
Sep 30 07:46:53 compute-0 sshd-session[225386]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:46:53 compute-0 sshd-session[225386]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:46:54 compute-0 sshd-session[225388]: Invalid user default from 52.224.109.126 port 56130
Sep 30 07:46:54 compute-0 sshd-session[225388]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:46:54 compute-0 sshd-session[225388]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:46:54 compute-0 nova_compute[189265]: 2025-09-30 07:46:54.866 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:55 compute-0 sshd-session[225390]: Invalid user tomcat from 52.224.109.126 port 56140
Sep 30 07:46:55 compute-0 sshd-session[225386]: Failed password for invalid user admin from 52.224.109.126 port 56124 ssh2
Sep 30 07:46:55 compute-0 sshd-session[225390]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:46:55 compute-0 sshd-session[225390]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:46:56 compute-0 nova_compute[189265]: 2025-09-30 07:46:56.297 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:46:56 compute-0 nova_compute[189265]: 2025-09-30 07:46:56.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:46:56 compute-0 sshd-session[225392]: Invalid user gitlab from 52.224.109.126 port 56152
Sep 30 07:46:56 compute-0 sshd-session[225392]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:46:56 compute-0 sshd-session[225392]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:46:56 compute-0 nova_compute[189265]: 2025-09-30 07:46:56.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:46:56 compute-0 podman[225394]: 2025-09-30 07:46:56.808441921 +0000 UTC m=+0.067941021 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:46:57 compute-0 sshd-session[225388]: Failed password for invalid user default from 52.224.109.126 port 56130 ssh2
Sep 30 07:46:57 compute-0 sshd-session[225386]: Connection closed by invalid user admin 52.224.109.126 port 56124 [preauth]
Sep 30 07:46:57 compute-0 unix_chkpwd[225420]: password check failed for user (root)
Sep 30 07:46:57 compute-0 sshd-session[225418]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126  user=root
Sep 30 07:46:57 compute-0 sshd-session[225390]: Failed password for invalid user tomcat from 52.224.109.126 port 56140 ssh2
Sep 30 07:46:57 compute-0 sshd-session[225392]: Failed password for invalid user gitlab from 52.224.109.126 port 56152 ssh2
Sep 30 07:46:57 compute-0 sshd-session[225392]: Connection closed by invalid user gitlab 52.224.109.126 port 56152 [preauth]
Sep 30 07:46:58 compute-0 sshd-session[225388]: Connection closed by invalid user default 52.224.109.126 port 56130 [preauth]
Sep 30 07:46:58 compute-0 sshd-session[225422]: Invalid user hadoop from 52.224.109.126 port 56168
Sep 30 07:46:58 compute-0 sshd-session[225422]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:46:58 compute-0 sshd-session[225422]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:46:58 compute-0 nova_compute[189265]: 2025-09-30 07:46:58.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:46:58 compute-0 sshd-session[225390]: Connection closed by invalid user tomcat 52.224.109.126 port 56140 [preauth]
Sep 30 07:46:58 compute-0 sshd-session[225418]: Failed password for root from 52.224.109.126 port 56154 ssh2
Sep 30 07:46:59 compute-0 sshd[124648]: drop connection #2 from [52.224.109.126]:56184 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:46:59 compute-0 sshd-session[225418]: Connection closed by authenticating user root 52.224.109.126 port 56154 [preauth]
Sep 30 07:46:59 compute-0 podman[199733]: time="2025-09-30T07:46:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:46:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:46:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:46:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:46:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3020 "" "Go-http-client/1.1"
Sep 30 07:46:59 compute-0 nova_compute[189265]: 2025-09-30 07:46:59.869 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:00 compute-0 sshd-session[225422]: Failed password for invalid user hadoop from 52.224.109.126 port 56168 ssh2
Sep 30 07:47:00 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:56190 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:00 compute-0 sshd-session[225422]: Connection closed by invalid user hadoop 52.224.109.126 port 56168 [preauth]
Sep 30 07:47:01 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56192 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:01 compute-0 openstack_network_exporter[201859]: ERROR   07:47:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:47:01 compute-0 openstack_network_exporter[201859]: ERROR   07:47:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:47:01 compute-0 openstack_network_exporter[201859]: ERROR   07:47:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:47:01 compute-0 openstack_network_exporter[201859]: ERROR   07:47:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:47:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:47:01 compute-0 openstack_network_exporter[201859]: ERROR   07:47:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:47:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:47:01 compute-0 nova_compute[189265]: 2025-09-30 07:47:01.490 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:02 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56206 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:02 compute-0 nova_compute[189265]: 2025-09-30 07:47:02.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:47:02 compute-0 nova_compute[189265]: 2025-09-30 07:47:02.789 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:47:03 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:39292 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:04 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:39306 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:04 compute-0 nova_compute[189265]: 2025-09-30 07:47:04.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:05 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:39318 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:05 compute-0 nova_compute[189265]: 2025-09-30 07:47:05.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:47:06 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:39334 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:06 compute-0 nova_compute[189265]: 2025-09-30 07:47:06.311 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:47:06 compute-0 nova_compute[189265]: 2025-09-30 07:47:06.312 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:47:06 compute-0 nova_compute[189265]: 2025-09-30 07:47:06.312 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:47:06 compute-0 nova_compute[189265]: 2025-09-30 07:47:06.312 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:47:06 compute-0 nova_compute[189265]: 2025-09-30 07:47:06.459 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:47:06 compute-0 nova_compute[189265]: 2025-09-30 07:47:06.460 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:47:06 compute-0 nova_compute[189265]: 2025-09-30 07:47:06.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:06 compute-0 nova_compute[189265]: 2025-09-30 07:47:06.499 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:47:06 compute-0 nova_compute[189265]: 2025-09-30 07:47:06.499 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5831MB free_disk=73.30353546142578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:47:06 compute-0 nova_compute[189265]: 2025-09-30 07:47:06.500 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:47:06 compute-0 nova_compute[189265]: 2025-09-30 07:47:06.500 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:47:07 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:39344 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:07 compute-0 podman[225425]: 2025-09-30 07:47:07.475313237 +0000 UTC m=+0.066704747 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, config_id=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 07:47:07 compute-0 nova_compute[189265]: 2025-09-30 07:47:07.547 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:47:07 compute-0 nova_compute[189265]: 2025-09-30 07:47:07.547 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:47:06 up  1:44,  0 user,  load average: 0.16, 0.28, 0.29\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:47:07 compute-0 nova_compute[189265]: 2025-09-30 07:47:07.565 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:47:08 compute-0 nova_compute[189265]: 2025-09-30 07:47:08.070 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:47:08 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:39354 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:08 compute-0 nova_compute[189265]: 2025-09-30 07:47:08.580 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:47:08 compute-0 nova_compute[189265]: 2025-09-30 07:47:08.580 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.080s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:47:09 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:39356 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:09 compute-0 nova_compute[189265]: 2025-09-30 07:47:09.581 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:47:09 compute-0 nova_compute[189265]: 2025-09-30 07:47:09.582 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:47:09 compute-0 nova_compute[189265]: 2025-09-30 07:47:09.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:10 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:39370 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:10 compute-0 nova_compute[189265]: 2025-09-30 07:47:10.922 2 DEBUG nova.virt.libvirt.driver [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Creating tmpfile /var/lib/nova/instances/tmp06bg8hq9 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 07:47:10 compute-0 nova_compute[189265]: 2025-09-30 07:47:10.924 2 WARNING neutronclient.v2_0.client [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:47:10 compute-0 nova_compute[189265]: 2025-09-30 07:47:10.927 2 DEBUG nova.compute.manager [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp06bg8hq9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 07:47:11 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:39380 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:11 compute-0 podman[225445]: 2025-09-30 07:47:11.48274863 +0000 UTC m=+0.071855915 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, version=9.6, architecture=x86_64, distribution-scope=public, managed_by=edpm_ansible, release=1755695350, name=ubi9-minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc.)
Sep 30 07:47:11 compute-0 nova_compute[189265]: 2025-09-30 07:47:11.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:12 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:39390 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:12 compute-0 nova_compute[189265]: 2025-09-30 07:47:12.963 2 WARNING neutronclient.v2_0.client [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:47:12 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:39404 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:13 compute-0 nova_compute[189265]: 2025-09-30 07:47:13.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:47:14 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56480 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:14 compute-0 podman[225467]: 2025-09-30 07:47:14.470049276 +0000 UTC m=+0.053413533 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 07:47:14 compute-0 podman[225466]: 2025-09-30 07:47:14.493111262 +0000 UTC m=+0.082006448 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:47:14 compute-0 podman[225468]: 2025-09-30 07:47:14.50621103 +0000 UTC m=+0.077323913 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=watcher_latest)
Sep 30 07:47:14 compute-0 nova_compute[189265]: 2025-09-30 07:47:14.916 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:14 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56484 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:15 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56494 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:16 compute-0 nova_compute[189265]: 2025-09-30 07:47:16.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:16 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56508 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:17 compute-0 nova_compute[189265]: 2025-09-30 07:47:17.388 2 DEBUG nova.compute.manager [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp06bg8hq9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3096df7e-a792-4d1d-b3bb-e97838a49460',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 07:47:17 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:56518 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:18 compute-0 nova_compute[189265]: 2025-09-30 07:47:18.401 2 DEBUG oslo_concurrency.lockutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-3096df7e-a792-4d1d-b3bb-e97838a49460" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:47:18 compute-0 nova_compute[189265]: 2025-09-30 07:47:18.401 2 DEBUG oslo_concurrency.lockutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-3096df7e-a792-4d1d-b3bb-e97838a49460" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:47:18 compute-0 nova_compute[189265]: 2025-09-30 07:47:18.401 2 DEBUG nova.network.neutron [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:47:18 compute-0 nova_compute[189265]: 2025-09-30 07:47:18.908 2 WARNING neutronclient.v2_0.client [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:47:18 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:56534 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:19 compute-0 sshd-session[225527]: Invalid user superadmin from 8.219.150.64 port 52010
Sep 30 07:47:19 compute-0 sshd-session[225527]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:47:19 compute-0 sshd-session[225527]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=8.219.150.64
Sep 30 07:47:19 compute-0 nova_compute[189265]: 2025-09-30 07:47:19.784 2 WARNING neutronclient.v2_0.client [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:47:19 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:56540 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:19 compute-0 nova_compute[189265]: 2025-09-30 07:47:19.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:20 compute-0 nova_compute[189265]: 2025-09-30 07:47:20.398 2 DEBUG nova.network.neutron [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Updating instance_info_cache with network_info: [{"id": "cc096913-0e34-4900-9703-81fac95bfd92", "address": "fa:16:3e:48:af:61", "network": {"id": "b5627ebe-9328-432f-88fb-b5b539662efd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-337845770-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6633f775c5d46dc9c6c213b63954b2f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc096913-0e", "ovs_interfaceid": "cc096913-0e34-4900-9703-81fac95bfd92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:47:20 compute-0 ovn_controller[91436]: 2025-09-30T07:47:20Z|00281|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Sep 30 07:47:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:20.594 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:47:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:20.594 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:47:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:20.595 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:47:20 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:56552 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:20 compute-0 nova_compute[189265]: 2025-09-30 07:47:20.905 2 DEBUG oslo_concurrency.lockutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-3096df7e-a792-4d1d-b3bb-e97838a49460" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:47:20 compute-0 nova_compute[189265]: 2025-09-30 07:47:20.925 2 DEBUG nova.virt.libvirt.driver [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp06bg8hq9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3096df7e-a792-4d1d-b3bb-e97838a49460',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 07:47:20 compute-0 nova_compute[189265]: 2025-09-30 07:47:20.926 2 DEBUG nova.virt.libvirt.driver [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Creating instance directory: /var/lib/nova/instances/3096df7e-a792-4d1d-b3bb-e97838a49460 pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 07:47:20 compute-0 nova_compute[189265]: 2025-09-30 07:47:20.927 2 DEBUG nova.virt.libvirt.driver [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Creating disk.info with the contents: {'/var/lib/nova/instances/3096df7e-a792-4d1d-b3bb-e97838a49460/disk': 'qcow2', '/var/lib/nova/instances/3096df7e-a792-4d1d-b3bb-e97838a49460/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 07:47:20 compute-0 nova_compute[189265]: 2025-09-30 07:47:20.928 2 DEBUG nova.virt.libvirt.driver [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 07:47:20 compute-0 nova_compute[189265]: 2025-09-30 07:47:20.928 2 DEBUG nova.objects.instance [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid 3096df7e-a792-4d1d-b3bb-e97838a49460 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:47:21 compute-0 sshd-session[225527]: Failed password for invalid user superadmin from 8.219.150.64 port 52010 ssh2
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.438 2 DEBUG oslo_utils.imageutils.format_inspector [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.442 2 DEBUG oslo_utils.imageutils.format_inspector [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.443 2 DEBUG oslo_concurrency.processutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.492 2 DEBUG oslo_concurrency.processutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.049s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.493 2 DEBUG oslo_concurrency.lockutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.494 2 DEBUG oslo_concurrency.lockutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.494 2 DEBUG oslo_utils.imageutils.format_inspector [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.497 2 DEBUG oslo_utils.imageutils.format_inspector [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.497 2 DEBUG oslo_concurrency.processutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.583 2 DEBUG oslo_concurrency.processutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.583 2 DEBUG oslo_concurrency.processutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/3096df7e-a792-4d1d-b3bb-e97838a49460/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.629 2 DEBUG oslo_concurrency.processutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/3096df7e-a792-4d1d-b3bb-e97838a49460/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.630 2 DEBUG oslo_concurrency.lockutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.136s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.630 2 DEBUG oslo_concurrency.processutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:47:21 compute-0 sshd[124648]: drop connection #2 from [52.224.109.126]:56558 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:21 compute-0 sshd-session[225530]: Invalid user test from 159.89.22.242 port 32920
Sep 30 07:47:21 compute-0 sshd-session[225530]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:47:21 compute-0 sshd-session[225530]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=159.89.22.242
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.705 2 DEBUG oslo_concurrency.processutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.706 2 DEBUG nova.virt.disk.api [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Checking if we can resize image /var/lib/nova/instances/3096df7e-a792-4d1d-b3bb-e97838a49460/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.706 2 DEBUG oslo_concurrency.processutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3096df7e-a792-4d1d-b3bb-e97838a49460/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:47:21 compute-0 sshd-session[225527]: Received disconnect from 8.219.150.64 port 52010:11: Bye Bye [preauth]
Sep 30 07:47:21 compute-0 sshd-session[225527]: Disconnected from invalid user superadmin 8.219.150.64 port 52010 [preauth]
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.767 2 DEBUG oslo_concurrency.processutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3096df7e-a792-4d1d-b3bb-e97838a49460/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.768 2 DEBUG nova.virt.disk.api [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Cannot resize image /var/lib/nova/instances/3096df7e-a792-4d1d-b3bb-e97838a49460/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:47:21 compute-0 nova_compute[189265]: 2025-09-30 07:47:21.768 2 DEBUG nova.objects.instance [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'migration_context' on Instance uuid 3096df7e-a792-4d1d-b3bb-e97838a49460 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.278 2 DEBUG nova.objects.base [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<3096df7e-a792-4d1d-b3bb-e97838a49460> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.279 2 DEBUG oslo_concurrency.processutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3096df7e-a792-4d1d-b3bb-e97838a49460/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.299 2 DEBUG oslo_concurrency.processutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/3096df7e-a792-4d1d-b3bb-e97838a49460/disk.config 497664" returned: 0 in 0.020s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.300 2 DEBUG nova.virt.libvirt.driver [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.302 2 DEBUG nova.virt.libvirt.vif [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T07:46:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-117756898',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-117756898',id=30,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:46:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='dd45f15fbdba414c8d395e5ff149cbc4',ramdisk_id='',reservation_id='r-b0y6rjar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model
='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-359850667',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-359850667-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:46:33Z,user_data=None,user_id='bf911d50b77e4e20a250e642038c8043',uuid=3096df7e-a792-4d1d-b3bb-e97838a49460,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc096913-0e34-4900-9703-81fac95bfd92", "address": "fa:16:3e:48:af:61", "network": {"id": "b5627ebe-9328-432f-88fb-b5b539662efd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-337845770-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6633f775c5d46dc9c6c213b63954b2f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapcc096913-0e", "ovs_interfaceid": "cc096913-0e34-4900-9703-81fac95bfd92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.302 2 DEBUG nova.network.os_vif_util [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "cc096913-0e34-4900-9703-81fac95bfd92", "address": "fa:16:3e:48:af:61", "network": {"id": "b5627ebe-9328-432f-88fb-b5b539662efd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-337845770-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6633f775c5d46dc9c6c213b63954b2f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapcc096913-0e", "ovs_interfaceid": "cc096913-0e34-4900-9703-81fac95bfd92", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.303 2 DEBUG nova.network.os_vif_util [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:48:af:61,bridge_name='br-int',has_traffic_filtering=True,id=cc096913-0e34-4900-9703-81fac95bfd92,network=Network(b5627ebe-9328-432f-88fb-b5b539662efd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc096913-0e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.303 2 DEBUG os_vif [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:af:61,bridge_name='br-int',has_traffic_filtering=True,id=cc096913-0e34-4900-9703-81fac95bfd92,network=Network(b5627ebe-9328-432f-88fb-b5b539662efd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc096913-0e') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.304 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.304 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.305 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'c5486814-0c3c-5312-9e94-c04237b3a77f', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.310 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcc096913-0e, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.310 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tapcc096913-0e, col_values=(('qos', UUID('59b27f01-b417-4fc3-bd9c-529adc037d3a')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.311 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tapcc096913-0e, col_values=(('external_ids', {'iface-id': 'cc096913-0e34-4900-9703-81fac95bfd92', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:48:af:61', 'vm-uuid': '3096df7e-a792-4d1d-b3bb-e97838a49460'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:22 compute-0 NetworkManager[51813]: <info>  [1759218442.3127] manager: (tapcc096913-0e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.317 2 INFO os_vif [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:48:af:61,bridge_name='br-int',has_traffic_filtering=True,id=cc096913-0e34-4900-9703-81fac95bfd92,network=Network(b5627ebe-9328-432f-88fb-b5b539662efd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc096913-0e')
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.318 2 DEBUG nova.virt.libvirt.driver [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.318 2 DEBUG nova.compute.manager [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp06bg8hq9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3096df7e-a792-4d1d-b3bb-e97838a49460',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.319 2 WARNING neutronclient.v2_0.client [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.399 2 WARNING neutronclient.v2_0.client [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:47:22 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:56570 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:22.683 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:47:22 compute-0 nova_compute[189265]: 2025-09-30 07:47:22.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:22.685 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:47:23 compute-0 sshd-session[225530]: Failed password for invalid user test from 159.89.22.242 port 32920 ssh2
Sep 30 07:47:23 compute-0 nova_compute[189265]: 2025-09-30 07:47:23.430 2 DEBUG nova.network.neutron [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Port cc096913-0e34-4900-9703-81fac95bfd92 updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 07:47:23 compute-0 nova_compute[189265]: 2025-09-30 07:47:23.451 2 DEBUG nova.compute.manager [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=74752,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp06bg8hq9',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='3096df7e-a792-4d1d-b3bb-e97838a49460',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 07:47:23 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:57264 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:24 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:57268 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:24 compute-0 sshd-session[225530]: Received disconnect from 159.89.22.242 port 32920:11: Bye Bye [preauth]
Sep 30 07:47:24 compute-0 sshd-session[225530]: Disconnected from invalid user test 159.89.22.242 port 32920 [preauth]
Sep 30 07:47:24 compute-0 nova_compute[189265]: 2025-09-30 07:47:24.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:25 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:57278 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:26 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:57288 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:26 compute-0 kernel: tapcc096913-0e: entered promiscuous mode
Sep 30 07:47:26 compute-0 NetworkManager[51813]: <info>  [1759218446.5522] manager: (tapcc096913-0e): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Sep 30 07:47:26 compute-0 nova_compute[189265]: 2025-09-30 07:47:26.554 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:26 compute-0 ovn_controller[91436]: 2025-09-30T07:47:26Z|00282|binding|INFO|Claiming lport cc096913-0e34-4900-9703-81fac95bfd92 for this additional chassis.
Sep 30 07:47:26 compute-0 ovn_controller[91436]: 2025-09-30T07:47:26Z|00283|binding|INFO|cc096913-0e34-4900-9703-81fac95bfd92: Claiming fa:16:3e:48:af:61 10.100.0.3
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.564 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:af:61 10.100.0.3'], port_security=['fa:16:3e:48:af:61 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3096df7e-a792-4d1d-b3bb-e97838a49460', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5627ebe-9328-432f-88fb-b5b539662efd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd45f15fbdba414c8d395e5ff149cbc4', 'neutron:revision_number': '10', 'neutron:security_group_ids': '3b4f5d53-fbd5-497e-8555-6b61b9a4d332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4956bab-94e0-436a-b508-eeb3061671e6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=cc096913-0e34-4900-9703-81fac95bfd92) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.566 100322 INFO neutron.agent.ovn.metadata.agent [-] Port cc096913-0e34-4900-9703-81fac95bfd92 in datapath b5627ebe-9328-432f-88fb-b5b539662efd unbound from our chassis
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.567 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5627ebe-9328-432f-88fb-b5b539662efd
Sep 30 07:47:26 compute-0 ovn_controller[91436]: 2025-09-30T07:47:26Z|00284|binding|INFO|Setting lport cc096913-0e34-4900-9703-81fac95bfd92 ovn-installed in OVS
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.585 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[180cae5a-8ff3-474f-af79-dcde5a479f0a]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:26 compute-0 nova_compute[189265]: 2025-09-30 07:47:26.586 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.586 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb5627ebe-91 in ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 07:47:26 compute-0 nova_compute[189265]: 2025-09-30 07:47:26.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.588 210650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb5627ebe-90 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.589 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7f22fb5c-4b8e-4d81-b901-9d87a0c4e4ae]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.591 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[e3c6d172-a1d7-4543-8095-0d7a52a8c700]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.612 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[6d77931e-0b6c-4b36-96ac-c18d462d70a2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:26 compute-0 systemd-machined[149233]: New machine qemu-24-instance-0000001e.
Sep 30 07:47:26 compute-0 systemd[1]: Started Virtual Machine qemu-24-instance-0000001e.
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.636 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d31b4cd0-f0a1-4734-961e-d10015e4a958]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:26 compute-0 systemd-udevd[225572]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:47:26 compute-0 NetworkManager[51813]: <info>  [1759218446.6780] device (tapcc096913-0e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:47:26 compute-0 NetworkManager[51813]: <info>  [1759218446.6792] device (tapcc096913-0e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.679 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[fc1e41f3-a880-4ec2-afac-9390efb04395]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.685 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d9a2e9b6-7d5d-46d8-8f80-d21ebeb3b0ba]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:26 compute-0 NetworkManager[51813]: <info>  [1759218446.6864] manager: (tapb5627ebe-90): new Veth device (/org/freedesktop/NetworkManager/Devices/99)
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.727 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[a61a695d-fe16-40ac-a0c2-2b34ed4203e3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.731 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[9c16476b-5448-4258-857c-476476ebb9cc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:26 compute-0 NetworkManager[51813]: <info>  [1759218446.7643] device (tapb5627ebe-90): carrier: link connected
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.773 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[46885b95-9abf-467a-922c-cbb065704e98]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.800 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[bb309d6d-912d-43ba-b13d-ec2dd495a755]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5627ebe-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:a5:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 630448, 'reachable_time': 18968, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225601, 'error': None, 'target': 'ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.829 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[cd064d15-ece9-486d-a422-d94947577361]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe04:a50f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 630448, 'tstamp': 630448}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225602, 'error': None, 'target': 'ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.856 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[6517e5e7-f69a-49cb-a8d1-958177a69cb1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5627ebe-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:a5:0f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 72], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 630448, 'reachable_time': 18968, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225603, 'error': None, 'target': 'ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:26 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:26.910 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb52e02-bc54-4416-94fe-153c30e160f7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:27.004 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[443686ef-0ea0-494a-9f2e-f390d39e4d15]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:27.005 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5627ebe-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:27.006 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:27.006 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5627ebe-90, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:47:27 compute-0 NetworkManager[51813]: <info>  [1759218447.0098] manager: (tapb5627ebe-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Sep 30 07:47:27 compute-0 kernel: tapb5627ebe-90: entered promiscuous mode
Sep 30 07:47:27 compute-0 nova_compute[189265]: 2025-09-30 07:47:27.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:27 compute-0 nova_compute[189265]: 2025-09-30 07:47:27.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:27.014 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5627ebe-90, col_values=(('external_ids', {'iface-id': '694a3f4b-a908-40fb-abbe-4687074d8093'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:47:27 compute-0 nova_compute[189265]: 2025-09-30 07:47:27.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:27 compute-0 ovn_controller[91436]: 2025-09-30T07:47:27Z|00285|binding|INFO|Releasing lport 694a3f4b-a908-40fb-abbe-4687074d8093 from this chassis (sb_readonly=0)
Sep 30 07:47:27 compute-0 nova_compute[189265]: 2025-09-30 07:47:27.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:27.021 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[faebb946-3407-4316-8335-99812b41e288]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:27.022 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:27.023 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:27.023 100322 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for b5627ebe-9328-432f-88fb-b5b539662efd disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:27.023 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:27.024 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0242e5e6-4ea9-4a46-ba00-d08d259c54c5]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:27.024 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:27.025 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[a499aaee-5185-457e-8a5b-346ff220d81e]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:27.026 100322 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: global
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     log         /dev/log local0 debug
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     log-tag     haproxy-metadata-proxy-b5627ebe-9328-432f-88fb-b5b539662efd
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     user        root
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     group       root
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     maxconn     1024
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     pidfile     /var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     daemon
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: defaults
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     log global
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     mode http
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     option httplog
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     option dontlognull
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     option http-server-close
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     option forwardfor
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     retries                 3
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     timeout http-request    30s
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     timeout connect         30s
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     timeout client          32s
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     timeout server          32s
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     timeout http-keep-alive 30s
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: listen listener
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     bind 169.254.169.254:80
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:     http-request add-header X-OVN-Network-ID b5627ebe-9328-432f-88fb-b5b539662efd
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 07:47:27 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:27.026 100322 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd', 'env', 'PROCESS_TAG=haproxy-b5627ebe-9328-432f-88fb-b5b539662efd', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b5627ebe-9328-432f-88fb-b5b539662efd.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 07:47:27 compute-0 nova_compute[189265]: 2025-09-30 07:47:27.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:27 compute-0 nova_compute[189265]: 2025-09-30 07:47:27.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:27 compute-0 podman[225635]: 2025-09-30 07:47:27.50497859 +0000 UTC m=+0.084675895 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:47:27 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:57298 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:27 compute-0 podman[225641]: 2025-09-30 07:47:27.52889422 +0000 UTC m=+0.083788239 container create b0289028dae91514e613445149a02af8fd8a740e4efe5ba963046d9741d69b66 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930)
Sep 30 07:47:27 compute-0 systemd[1]: Started libpod-conmon-b0289028dae91514e613445149a02af8fd8a740e4efe5ba963046d9741d69b66.scope.
Sep 30 07:47:27 compute-0 podman[225641]: 2025-09-30 07:47:27.488572656 +0000 UTC m=+0.043466765 image pull eeebcc09bc72f81ab45f5ab87eb8f6a7b554b949227aeec082bdb0732754ddc8 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 07:47:27 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:47:27 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d311a95a22c6d94b875e59cec054dc63c39b6db02a7f01b8ec48673d071aae2f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 07:47:27 compute-0 podman[225641]: 2025-09-30 07:47:27.629709861 +0000 UTC m=+0.184603930 container init b0289028dae91514e613445149a02af8fd8a740e4efe5ba963046d9741d69b66 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930)
Sep 30 07:47:27 compute-0 podman[225641]: 2025-09-30 07:47:27.641673696 +0000 UTC m=+0.196567755 container start b0289028dae91514e613445149a02af8fd8a740e4efe5ba963046d9741d69b66 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS)
Sep 30 07:47:27 compute-0 neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd[225680]: [NOTICE]   (225684) : New worker (225686) forked
Sep 30 07:47:27 compute-0 neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd[225680]: [NOTICE]   (225684) : Loading success.
Sep 30 07:47:28 compute-0 sshd-session[225707]: Invalid user dev from 52.224.109.126 port 57314
Sep 30 07:47:28 compute-0 sshd-session[225707]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:47:28 compute-0 sshd-session[225707]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:47:29 compute-0 sshd-session[225709]: Invalid user guest from 52.224.109.126 port 57326
Sep 30 07:47:29 compute-0 sshd-session[225709]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:47:29 compute-0 sshd-session[225709]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:47:29 compute-0 ovn_controller[91436]: 2025-09-30T07:47:29Z|00286|binding|INFO|Claiming lport cc096913-0e34-4900-9703-81fac95bfd92 for this chassis.
Sep 30 07:47:29 compute-0 ovn_controller[91436]: 2025-09-30T07:47:29Z|00287|binding|INFO|cc096913-0e34-4900-9703-81fac95bfd92: Claiming fa:16:3e:48:af:61 10.100.0.3
Sep 30 07:47:29 compute-0 ovn_controller[91436]: 2025-09-30T07:47:29Z|00288|binding|INFO|Setting lport cc096913-0e34-4900-9703-81fac95bfd92 up in Southbound
Sep 30 07:47:29 compute-0 podman[199733]: time="2025-09-30T07:47:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:47:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:47:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:47:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:47:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3475 "" "Go-http-client/1.1"
Sep 30 07:47:29 compute-0 nova_compute[189265]: 2025-09-30 07:47:29.942 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:30 compute-0 sshd-session[225707]: Failed password for invalid user dev from 52.224.109.126 port 57314 ssh2
Sep 30 07:47:30 compute-0 sshd-session[225711]: Invalid user tomcat from 52.224.109.126 port 57330
Sep 30 07:47:30 compute-0 sshd-session[225711]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:47:30 compute-0 sshd-session[225711]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:47:30 compute-0 nova_compute[189265]: 2025-09-30 07:47:30.720 2 INFO nova.compute.manager [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Post operation of migration started
Sep 30 07:47:30 compute-0 nova_compute[189265]: 2025-09-30 07:47:30.721 2 WARNING neutronclient.v2_0.client [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:47:30 compute-0 nova_compute[189265]: 2025-09-30 07:47:30.828 2 WARNING neutronclient.v2_0.client [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:47:30 compute-0 nova_compute[189265]: 2025-09-30 07:47:30.830 2 WARNING neutronclient.v2_0.client [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:47:31 compute-0 nova_compute[189265]: 2025-09-30 07:47:31.070 2 DEBUG oslo_concurrency.lockutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-3096df7e-a792-4d1d-b3bb-e97838a49460" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:47:31 compute-0 nova_compute[189265]: 2025-09-30 07:47:31.071 2 DEBUG oslo_concurrency.lockutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-3096df7e-a792-4d1d-b3bb-e97838a49460" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:47:31 compute-0 nova_compute[189265]: 2025-09-30 07:47:31.072 2 DEBUG nova.network.neutron [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:47:31 compute-0 sshd-session[225707]: Connection closed by invalid user dev 52.224.109.126 port 57314 [preauth]
Sep 30 07:47:31 compute-0 openstack_network_exporter[201859]: ERROR   07:47:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:47:31 compute-0 openstack_network_exporter[201859]: ERROR   07:47:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:47:31 compute-0 openstack_network_exporter[201859]: ERROR   07:47:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:47:31 compute-0 openstack_network_exporter[201859]: ERROR   07:47:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:47:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:47:31 compute-0 openstack_network_exporter[201859]: ERROR   07:47:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:47:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:47:31 compute-0 sshd-session[225713]: Invalid user elsearch from 52.224.109.126 port 57344
Sep 30 07:47:31 compute-0 sshd-session[225709]: Failed password for invalid user guest from 52.224.109.126 port 57326 ssh2
Sep 30 07:47:31 compute-0 sshd-session[225713]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:47:31 compute-0 sshd-session[225713]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:47:31 compute-0 nova_compute[189265]: 2025-09-30 07:47:31.579 2 WARNING neutronclient.v2_0.client [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:47:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:31.687 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:47:32 compute-0 nova_compute[189265]: 2025-09-30 07:47:32.060 2 WARNING neutronclient.v2_0.client [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:47:32 compute-0 sshd-session[225711]: Failed password for invalid user tomcat from 52.224.109.126 port 57330 ssh2
Sep 30 07:47:32 compute-0 nova_compute[189265]: 2025-09-30 07:47:32.304 2 DEBUG nova.network.neutron [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Updating instance_info_cache with network_info: [{"id": "cc096913-0e34-4900-9703-81fac95bfd92", "address": "fa:16:3e:48:af:61", "network": {"id": "b5627ebe-9328-432f-88fb-b5b539662efd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-337845770-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6633f775c5d46dc9c6c213b63954b2f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc096913-0e", "ovs_interfaceid": "cc096913-0e34-4900-9703-81fac95bfd92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:47:32 compute-0 sshd-session[225709]: Connection closed by invalid user guest 52.224.109.126 port 57326 [preauth]
Sep 30 07:47:32 compute-0 nova_compute[189265]: 2025-09-30 07:47:32.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:32 compute-0 sshd-session[225715]: Invalid user git from 52.224.109.126 port 57350
Sep 30 07:47:32 compute-0 sshd-session[225715]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:47:32 compute-0 sshd-session[225715]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:47:32 compute-0 nova_compute[189265]: 2025-09-30 07:47:32.811 2 DEBUG oslo_concurrency.lockutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-3096df7e-a792-4d1d-b3bb-e97838a49460" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:47:33 compute-0 nova_compute[189265]: 2025-09-30 07:47:33.330 2 DEBUG oslo_concurrency.lockutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:47:33 compute-0 nova_compute[189265]: 2025-09-30 07:47:33.330 2 DEBUG oslo_concurrency.lockutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:47:33 compute-0 nova_compute[189265]: 2025-09-30 07:47:33.331 2 DEBUG oslo_concurrency.lockutils [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:47:33 compute-0 nova_compute[189265]: 2025-09-30 07:47:33.336 2 INFO nova.virt.libvirt.driver [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 07:47:33 compute-0 virtqemud[189090]: Domain id=24 name='instance-0000001e' uuid=3096df7e-a792-4d1d-b3bb-e97838a49460 is tainted: custom-monitor
Sep 30 07:47:33 compute-0 sshd-session[225717]: Invalid user vagrant from 52.224.109.126 port 51328
Sep 30 07:47:33 compute-0 sshd-session[225717]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:47:33 compute-0 sshd-session[225717]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:47:33 compute-0 sshd-session[225711]: Connection closed by invalid user tomcat 52.224.109.126 port 57330 [preauth]
Sep 30 07:47:33 compute-0 sshd-session[225713]: Failed password for invalid user elsearch from 52.224.109.126 port 57344 ssh2
Sep 30 07:47:34 compute-0 sshd-session[225715]: Failed password for invalid user git from 52.224.109.126 port 57350 ssh2
Sep 30 07:47:34 compute-0 sshd-session[225719]: Invalid user esuser from 52.224.109.126 port 51340
Sep 30 07:47:34 compute-0 sshd-session[225715]: Connection closed by invalid user git 52.224.109.126 port 57350 [preauth]
Sep 30 07:47:34 compute-0 sshd-session[225719]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:47:34 compute-0 sshd-session[225719]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:47:34 compute-0 nova_compute[189265]: 2025-09-30 07:47:34.346 2 INFO nova.virt.libvirt.driver [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 07:47:34 compute-0 sshd-session[225717]: Failed password for invalid user vagrant from 52.224.109.126 port 51328 ssh2
Sep 30 07:47:34 compute-0 nova_compute[189265]: 2025-09-30 07:47:34.945 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:35 compute-0 sshd-session[225713]: Connection closed by invalid user elsearch 52.224.109.126 port 57344 [preauth]
Sep 30 07:47:35 compute-0 sshd[124648]: drop connection #2 from [52.224.109.126]:51346 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:35 compute-0 nova_compute[189265]: 2025-09-30 07:47:35.353 2 INFO nova.virt.libvirt.driver [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 07:47:35 compute-0 nova_compute[189265]: 2025-09-30 07:47:35.360 2 DEBUG nova.compute.manager [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:47:35 compute-0 sshd-session[225719]: Failed password for invalid user esuser from 52.224.109.126 port 51340 ssh2
Sep 30 07:47:35 compute-0 sshd-session[225717]: Connection closed by invalid user vagrant 52.224.109.126 port 51328 [preauth]
Sep 30 07:47:35 compute-0 nova_compute[189265]: 2025-09-30 07:47:35.871 2 DEBUG nova.objects.instance [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 07:47:36 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:51352 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:36 compute-0 sshd-session[225719]: Connection closed by invalid user esuser 52.224.109.126 port 51340 [preauth]
Sep 30 07:47:36 compute-0 nova_compute[189265]: 2025-09-30 07:47:36.893 2 WARNING neutronclient.v2_0.client [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:47:36 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51356 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:37 compute-0 nova_compute[189265]: 2025-09-30 07:47:37.238 2 WARNING neutronclient.v2_0.client [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:47:37 compute-0 nova_compute[189265]: 2025-09-30 07:47:37.239 2 WARNING neutronclient.v2_0.client [None req-85a25344-1808-4858-8de5-38fcf0345faa e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:47:37 compute-0 nova_compute[189265]: 2025-09-30 07:47:37.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:37 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51358 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:38 compute-0 podman[225721]: 2025-09-30 07:47:38.511118501 +0000 UTC m=+0.087261200 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20250930)
Sep 30 07:47:38 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51362 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:39 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51376 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:39 compute-0 nova_compute[189265]: 2025-09-30 07:47:39.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:40 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51388 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:41 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51404 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:42 compute-0 nova_compute[189265]: 2025-09-30 07:47:42.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:42 compute-0 podman[225742]: 2025-09-30 07:47:42.477939473 +0000 UTC m=+0.062961749 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1755695350, config_id=edpm, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, version=9.6)
Sep 30 07:47:42 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51412 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:43 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:58902 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:44 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:58908 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:44 compute-0 nova_compute[189265]: 2025-09-30 07:47:44.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:45 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:58916 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:45 compute-0 podman[225764]: 2025-09-30 07:47:45.503794891 +0000 UTC m=+0.084800569 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Sep 30 07:47:45 compute-0 podman[225765]: 2025-09-30 07:47:45.520104421 +0000 UTC m=+0.092422239 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930)
Sep 30 07:47:45 compute-0 podman[225766]: 2025-09-30 07:47:45.538276206 +0000 UTC m=+0.117932025 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller)
Sep 30 07:47:46 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:58922 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:47 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:58928 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:47 compute-0 nova_compute[189265]: 2025-09-30 07:47:47.415 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:47 compute-0 nova_compute[189265]: 2025-09-30 07:47:47.978 2 DEBUG oslo_concurrency.lockutils [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Acquiring lock "3096df7e-a792-4d1d-b3bb-e97838a49460" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:47:47 compute-0 nova_compute[189265]: 2025-09-30 07:47:47.979 2 DEBUG oslo_concurrency.lockutils [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Lock "3096df7e-a792-4d1d-b3bb-e97838a49460" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:47:47 compute-0 nova_compute[189265]: 2025-09-30 07:47:47.979 2 DEBUG oslo_concurrency.lockutils [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Acquiring lock "3096df7e-a792-4d1d-b3bb-e97838a49460-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:47:47 compute-0 nova_compute[189265]: 2025-09-30 07:47:47.979 2 DEBUG oslo_concurrency.lockutils [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Lock "3096df7e-a792-4d1d-b3bb-e97838a49460-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:47:47 compute-0 nova_compute[189265]: 2025-09-30 07:47:47.979 2 DEBUG oslo_concurrency.lockutils [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Lock "3096df7e-a792-4d1d-b3bb-e97838a49460-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:47:47 compute-0 nova_compute[189265]: 2025-09-30 07:47:47.992 2 INFO nova.compute.manager [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Terminating instance
Sep 30 07:47:48 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:58936 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:48 compute-0 nova_compute[189265]: 2025-09-30 07:47:48.510 2 DEBUG nova.compute.manager [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:47:48 compute-0 kernel: tapcc096913-0e (unregistering): left promiscuous mode
Sep 30 07:47:48 compute-0 NetworkManager[51813]: <info>  [1759218468.5358] device (tapcc096913-0e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:47:48 compute-0 ovn_controller[91436]: 2025-09-30T07:47:48Z|00289|binding|INFO|Releasing lport cc096913-0e34-4900-9703-81fac95bfd92 from this chassis (sb_readonly=0)
Sep 30 07:47:48 compute-0 ovn_controller[91436]: 2025-09-30T07:47:48Z|00290|binding|INFO|Setting lport cc096913-0e34-4900-9703-81fac95bfd92 down in Southbound
Sep 30 07:47:48 compute-0 ovn_controller[91436]: 2025-09-30T07:47:48Z|00291|binding|INFO|Removing iface tapcc096913-0e ovn-installed in OVS
Sep 30 07:47:48 compute-0 nova_compute[189265]: 2025-09-30 07:47:48.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:48.557 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:48:af:61 10.100.0.3'], port_security=['fa:16:3e:48:af:61 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '3096df7e-a792-4d1d-b3bb-e97838a49460', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5627ebe-9328-432f-88fb-b5b539662efd', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'dd45f15fbdba414c8d395e5ff149cbc4', 'neutron:revision_number': '15', 'neutron:security_group_ids': '3b4f5d53-fbd5-497e-8555-6b61b9a4d332', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4956bab-94e0-436a-b508-eeb3061671e6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=cc096913-0e34-4900-9703-81fac95bfd92) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:47:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:48.558 100322 INFO neutron.agent.ovn.metadata.agent [-] Port cc096913-0e34-4900-9703-81fac95bfd92 in datapath b5627ebe-9328-432f-88fb-b5b539662efd unbound from our chassis
Sep 30 07:47:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:48.560 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b5627ebe-9328-432f-88fb-b5b539662efd, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:47:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:48.561 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[145da421-7eaa-4ee4-80be-c7e1aec43450]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:48.561 100322 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd namespace which is not needed anymore
Sep 30 07:47:48 compute-0 nova_compute[189265]: 2025-09-30 07:47:48.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:48 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Sep 30 07:47:48 compute-0 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d0000001e.scope: Consumed 2.785s CPU time.
Sep 30 07:47:48 compute-0 systemd-machined[149233]: Machine qemu-24-instance-0000001e terminated.
Sep 30 07:47:48 compute-0 neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd[225680]: [NOTICE]   (225684) : haproxy version is 3.0.5-8e879a5
Sep 30 07:47:48 compute-0 neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd[225680]: [NOTICE]   (225684) : path to executable is /usr/sbin/haproxy
Sep 30 07:47:48 compute-0 neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd[225680]: [WARNING]  (225684) : Exiting Master process...
Sep 30 07:47:48 compute-0 podman[225850]: 2025-09-30 07:47:48.697137734 +0000 UTC m=+0.044117485 container kill b0289028dae91514e613445149a02af8fd8a740e4efe5ba963046d9741d69b66 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 07:47:48 compute-0 neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd[225680]: [ALERT]    (225684) : Current worker (225686) exited with code 143 (Terminated)
Sep 30 07:47:48 compute-0 neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd[225680]: [WARNING]  (225684) : All workers exited. Exiting... (0)
Sep 30 07:47:48 compute-0 systemd[1]: libpod-b0289028dae91514e613445149a02af8fd8a740e4efe5ba963046d9741d69b66.scope: Deactivated successfully.
Sep 30 07:47:48 compute-0 nova_compute[189265]: 2025-09-30 07:47:48.727 2 DEBUG nova.compute.manager [req-bf70bdc8-04d2-4497-ab02-14f9db14cd62 req-a360de5e-b6a1-4ed3-9d46-a0fac942245a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Received event network-vif-unplugged-cc096913-0e34-4900-9703-81fac95bfd92 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:47:48 compute-0 nova_compute[189265]: 2025-09-30 07:47:48.729 2 DEBUG oslo_concurrency.lockutils [req-bf70bdc8-04d2-4497-ab02-14f9db14cd62 req-a360de5e-b6a1-4ed3-9d46-a0fac942245a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "3096df7e-a792-4d1d-b3bb-e97838a49460-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:47:48 compute-0 nova_compute[189265]: 2025-09-30 07:47:48.729 2 DEBUG oslo_concurrency.lockutils [req-bf70bdc8-04d2-4497-ab02-14f9db14cd62 req-a360de5e-b6a1-4ed3-9d46-a0fac942245a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3096df7e-a792-4d1d-b3bb-e97838a49460-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:47:48 compute-0 nova_compute[189265]: 2025-09-30 07:47:48.730 2 DEBUG oslo_concurrency.lockutils [req-bf70bdc8-04d2-4497-ab02-14f9db14cd62 req-a360de5e-b6a1-4ed3-9d46-a0fac942245a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3096df7e-a792-4d1d-b3bb-e97838a49460-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:47:48 compute-0 nova_compute[189265]: 2025-09-30 07:47:48.730 2 DEBUG nova.compute.manager [req-bf70bdc8-04d2-4497-ab02-14f9db14cd62 req-a360de5e-b6a1-4ed3-9d46-a0fac942245a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] No waiting events found dispatching network-vif-unplugged-cc096913-0e34-4900-9703-81fac95bfd92 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:47:48 compute-0 nova_compute[189265]: 2025-09-30 07:47:48.730 2 DEBUG nova.compute.manager [req-bf70bdc8-04d2-4497-ab02-14f9db14cd62 req-a360de5e-b6a1-4ed3-9d46-a0fac942245a 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Received event network-vif-unplugged-cc096913-0e34-4900-9703-81fac95bfd92 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:47:48 compute-0 podman[225867]: 2025-09-30 07:47:48.752234585 +0000 UTC m=+0.023549321 container died b0289028dae91514e613445149a02af8fd8a740e4efe5ba963046d9741d69b66 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 07:47:48 compute-0 systemd[1]: var-lib-containers-storage-overlay-d311a95a22c6d94b875e59cec054dc63c39b6db02a7f01b8ec48673d071aae2f-merged.mount: Deactivated successfully.
Sep 30 07:47:48 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b0289028dae91514e613445149a02af8fd8a740e4efe5ba963046d9741d69b66-userdata-shm.mount: Deactivated successfully.
Sep 30 07:47:48 compute-0 nova_compute[189265]: 2025-09-30 07:47:48.793 2 INFO nova.virt.libvirt.driver [-] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Instance destroyed successfully.
Sep 30 07:47:48 compute-0 nova_compute[189265]: 2025-09-30 07:47:48.794 2 DEBUG nova.objects.instance [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Lazy-loading 'resources' on Instance uuid 3096df7e-a792-4d1d-b3bb-e97838a49460 obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:47:48 compute-0 podman[225867]: 2025-09-30 07:47:48.810547568 +0000 UTC m=+0.081862284 container remove b0289028dae91514e613445149a02af8fd8a740e4efe5ba963046d9741d69b66 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930)
Sep 30 07:47:48 compute-0 systemd[1]: libpod-conmon-b0289028dae91514e613445149a02af8fd8a740e4efe5ba963046d9741d69b66.scope: Deactivated successfully.
Sep 30 07:47:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:48.819 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[2c58266e-af04-407e-b97c-34953e4e5e33]: (4, ("Tue Sep 30 07:47:48 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd (b0289028dae91514e613445149a02af8fd8a740e4efe5ba963046d9741d69b66)\nb0289028dae91514e613445149a02af8fd8a740e4efe5ba963046d9741d69b66\nTue Sep 30 07:47:48 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd (b0289028dae91514e613445149a02af8fd8a740e4efe5ba963046d9741d69b66)\nb0289028dae91514e613445149a02af8fd8a740e4efe5ba963046d9741d69b66\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:48.820 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b729586f-2102-4566-a223-a9c44aa1c6f4]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:48.821 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5627ebe-9328-432f-88fb-b5b539662efd.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:47:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:48.822 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[474010a3-8f3f-4f61-9ed3-7e1eb985cbed]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:48.822 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5627ebe-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:47:48 compute-0 kernel: tapb5627ebe-90: left promiscuous mode
Sep 30 07:47:48 compute-0 nova_compute[189265]: 2025-09-30 07:47:48.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:48 compute-0 nova_compute[189265]: 2025-09-30 07:47:48.840 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:48.846 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[ea8bc7af-d4ee-4168-99b7-00536f2070b3]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:48 compute-0 nova_compute[189265]: 2025-09-30 07:47:48.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:48.880 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[cd1350ed-9c45-4095-b644-cd5a7e0f63e9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:48.881 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f1de3017-fe44-4d75-8f61-9c1f165d854c]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:48.895 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[7ca13f86-3cc2-4237-9a9e-5c4a241f0e2a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 630439, 'reachable_time': 16816, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225915, 'error': None, 'target': 'ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:48.898 100440 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b5627ebe-9328-432f-88fb-b5b539662efd deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 07:47:48 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:47:48.898 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[2e92b9a1-69af-4203-9390-dd0eb2af1c19]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:47:48 compute-0 systemd[1]: run-netns-ovnmeta\x2db5627ebe\x2d9328\x2d432f\x2d88fb\x2db5b539662efd.mount: Deactivated successfully.
Sep 30 07:47:49 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:58952 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.302 2 DEBUG nova.virt.libvirt.vif [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T07:46:15Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteWorkloadBalanceStrategy-server-117756898',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(3),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecuteworkloadbalancestrategy-server-117756898',id=30,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=3,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:46:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=1151,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='dd45f15fbdba414c8d395e5ff149cbc4',ramdisk_id='',reservation_id='r-b0y6rjar',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member,manager',clean_attempts='1',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteWorkloadBalanceStrategy-359850667',owner_user_name='tempest-TestExecuteWorkloadBalanceStrategy-359850667-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:47:36Z,user_data=None,user_id='bf911d50b77e4e20a250e642038c8043',uuid=3096df7e-a792-4d1d-b3bb-e97838a49460,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cc096913-0e34-4900-9703-81fac95bfd92", "address": "fa:16:3e:48:af:61", "network": {"id": "b5627ebe-9328-432f-88fb-b5b539662efd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-337845770-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6633f775c5d46dc9c6c213b63954b2f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc096913-0e", "ovs_interfaceid": "cc096913-0e34-4900-9703-81fac95bfd92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.302 2 DEBUG nova.network.os_vif_util [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Converting VIF {"id": "cc096913-0e34-4900-9703-81fac95bfd92", "address": "fa:16:3e:48:af:61", "network": {"id": "b5627ebe-9328-432f-88fb-b5b539662efd", "bridge": "br-int", "label": "tempest-TestExecuteWorkloadBalanceStrategy-337845770-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "e6633f775c5d46dc9c6c213b63954b2f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcc096913-0e", "ovs_interfaceid": "cc096913-0e34-4900-9703-81fac95bfd92", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.303 2 DEBUG nova.network.os_vif_util [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:48:af:61,bridge_name='br-int',has_traffic_filtering=True,id=cc096913-0e34-4900-9703-81fac95bfd92,network=Network(b5627ebe-9328-432f-88fb-b5b539662efd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc096913-0e') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.303 2 DEBUG os_vif [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:48:af:61,bridge_name='br-int',has_traffic_filtering=True,id=cc096913-0e34-4900-9703-81fac95bfd92,network=Network(b5627ebe-9328-432f-88fb-b5b539662efd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc096913-0e') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.305 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.305 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcc096913-0e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.310 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=59b27f01-b417-4fc3-bd9c-529adc037d3a) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.314 2 INFO os_vif [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:48:af:61,bridge_name='br-int',has_traffic_filtering=True,id=cc096913-0e34-4900-9703-81fac95bfd92,network=Network(b5627ebe-9328-432f-88fb-b5b539662efd),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcc096913-0e')
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.315 2 INFO nova.virt.libvirt.driver [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Deleting instance files /var/lib/nova/instances/3096df7e-a792-4d1d-b3bb-e97838a49460_del
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.316 2 INFO nova.virt.libvirt.driver [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Deletion of /var/lib/nova/instances/3096df7e-a792-4d1d-b3bb-e97838a49460_del complete
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.830 2 INFO nova.compute.manager [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Took 1.32 seconds to destroy the instance on the hypervisor.
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.831 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.832 2 DEBUG nova.compute.manager [-] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.832 2 DEBUG nova.network.neutron [-] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.832 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:47:49 compute-0 nova_compute[189265]: 2025-09-30 07:47:49.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:50 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:58968 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:50 compute-0 nova_compute[189265]: 2025-09-30 07:47:50.268 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:47:50 compute-0 nova_compute[189265]: 2025-09-30 07:47:50.596 2 DEBUG nova.compute.manager [req-092e98a5-f9f0-4dc5-adbe-f3745c8c2871 req-16fe0dd8-bd16-443c-8b4c-e84295803293 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Received event network-vif-deleted-cc096913-0e34-4900-9703-81fac95bfd92 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:47:50 compute-0 nova_compute[189265]: 2025-09-30 07:47:50.596 2 INFO nova.compute.manager [req-092e98a5-f9f0-4dc5-adbe-f3745c8c2871 req-16fe0dd8-bd16-443c-8b4c-e84295803293 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Neutron deleted interface cc096913-0e34-4900-9703-81fac95bfd92; detaching it from the instance and deleting it from the info cache
Sep 30 07:47:50 compute-0 nova_compute[189265]: 2025-09-30 07:47:50.596 2 DEBUG nova.network.neutron [req-092e98a5-f9f0-4dc5-adbe-f3745c8c2871 req-16fe0dd8-bd16-443c-8b4c-e84295803293 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:47:50 compute-0 nova_compute[189265]: 2025-09-30 07:47:50.775 2 DEBUG nova.compute.manager [req-fc4bc90e-4800-4e38-9ba0-6ff68e49d5fc req-844c56c0-778b-4230-9aab-0480677acadd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Received event network-vif-unplugged-cc096913-0e34-4900-9703-81fac95bfd92 external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:47:50 compute-0 nova_compute[189265]: 2025-09-30 07:47:50.776 2 DEBUG oslo_concurrency.lockutils [req-fc4bc90e-4800-4e38-9ba0-6ff68e49d5fc req-844c56c0-778b-4230-9aab-0480677acadd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "3096df7e-a792-4d1d-b3bb-e97838a49460-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:47:50 compute-0 nova_compute[189265]: 2025-09-30 07:47:50.776 2 DEBUG oslo_concurrency.lockutils [req-fc4bc90e-4800-4e38-9ba0-6ff68e49d5fc req-844c56c0-778b-4230-9aab-0480677acadd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3096df7e-a792-4d1d-b3bb-e97838a49460-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:47:50 compute-0 nova_compute[189265]: 2025-09-30 07:47:50.777 2 DEBUG oslo_concurrency.lockutils [req-fc4bc90e-4800-4e38-9ba0-6ff68e49d5fc req-844c56c0-778b-4230-9aab-0480677acadd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "3096df7e-a792-4d1d-b3bb-e97838a49460-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:47:50 compute-0 nova_compute[189265]: 2025-09-30 07:47:50.777 2 DEBUG nova.compute.manager [req-fc4bc90e-4800-4e38-9ba0-6ff68e49d5fc req-844c56c0-778b-4230-9aab-0480677acadd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] No waiting events found dispatching network-vif-unplugged-cc096913-0e34-4900-9703-81fac95bfd92 pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:47:50 compute-0 nova_compute[189265]: 2025-09-30 07:47:50.778 2 DEBUG nova.compute.manager [req-fc4bc90e-4800-4e38-9ba0-6ff68e49d5fc req-844c56c0-778b-4230-9aab-0480677acadd 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Received event network-vif-unplugged-cc096913-0e34-4900-9703-81fac95bfd92 for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:47:51 compute-0 nova_compute[189265]: 2025-09-30 07:47:51.034 2 DEBUG nova.network.neutron [-] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:47:51 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:58978 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:51 compute-0 nova_compute[189265]: 2025-09-30 07:47:51.104 2 DEBUG nova.compute.manager [req-092e98a5-f9f0-4dc5-adbe-f3745c8c2871 req-16fe0dd8-bd16-443c-8b4c-e84295803293 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Detach interface failed, port_id=cc096913-0e34-4900-9703-81fac95bfd92, reason: Instance 3096df7e-a792-4d1d-b3bb-e97838a49460 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 07:47:51 compute-0 nova_compute[189265]: 2025-09-30 07:47:51.542 2 INFO nova.compute.manager [-] [instance: 3096df7e-a792-4d1d-b3bb-e97838a49460] Took 1.71 seconds to deallocate network for instance.
Sep 30 07:47:51 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:58990 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:52 compute-0 nova_compute[189265]: 2025-09-30 07:47:52.066 2 DEBUG oslo_concurrency.lockutils [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:47:52 compute-0 nova_compute[189265]: 2025-09-30 07:47:52.066 2 DEBUG oslo_concurrency.lockutils [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:47:52 compute-0 nova_compute[189265]: 2025-09-30 07:47:52.073 2 DEBUG oslo_concurrency.lockutils [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:47:52 compute-0 nova_compute[189265]: 2025-09-30 07:47:52.120 2 INFO nova.scheduler.client.report [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Deleted allocations for instance 3096df7e-a792-4d1d-b3bb-e97838a49460
Sep 30 07:47:52 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:59004 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:53 compute-0 nova_compute[189265]: 2025-09-30 07:47:53.152 2 DEBUG oslo_concurrency.lockutils [None req-4a473ff4-5fb4-4ab5-b9d4-f71de37b3e9b bf911d50b77e4e20a250e642038c8043 dd45f15fbdba414c8d395e5ff149cbc4 - - default default] Lock "3096df7e-a792-4d1d-b3bb-e97838a49460" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.173s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:47:53 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:47480 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:54 compute-0 nova_compute[189265]: 2025-09-30 07:47:54.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:54 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:47488 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:54 compute-0 nova_compute[189265]: 2025-09-30 07:47:54.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:55 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:47494 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:56 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:47508 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:57 compute-0 nova_compute[189265]: 2025-09-30 07:47:57.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:57 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:47514 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:57 compute-0 nova_compute[189265]: 2025-09-30 07:47:57.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:47:58 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:47520 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:58 compute-0 podman[225918]: 2025-09-30 07:47:58.490809761 +0000 UTC m=+0.072249337 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:47:58 compute-0 nova_compute[189265]: 2025-09-30 07:47:58.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:47:59 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:47528 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:47:59 compute-0 nova_compute[189265]: 2025-09-30 07:47:59.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:47:59 compute-0 podman[199733]: time="2025-09-30T07:47:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:47:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:47:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:47:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:47:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Sep 30 07:47:59 compute-0 nova_compute[189265]: 2025-09-30 07:47:59.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:00 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:47540 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:00 compute-0 nova_compute[189265]: 2025-09-30 07:48:00.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:48:01 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:47556 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:01 compute-0 openstack_network_exporter[201859]: ERROR   07:48:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:48:01 compute-0 openstack_network_exporter[201859]: ERROR   07:48:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:48:01 compute-0 openstack_network_exporter[201859]: ERROR   07:48:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:48:01 compute-0 openstack_network_exporter[201859]: ERROR   07:48:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:48:01 compute-0 openstack_network_exporter[201859]: ERROR   07:48:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:48:02 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:47566 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:03 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:44616 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:03 compute-0 nova_compute[189265]: 2025-09-30 07:48:03.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:48:03 compute-0 nova_compute[189265]: 2025-09-30 07:48:03.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:48:04 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:44622 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:04 compute-0 nova_compute[189265]: 2025-09-30 07:48:04.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:04 compute-0 nova_compute[189265]: 2025-09-30 07:48:04.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:04 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:44626 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:05 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:44634 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:06 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:44638 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:07 compute-0 sshd-session[225943]: Invalid user oracle from 52.224.109.126 port 44650
Sep 30 07:48:07 compute-0 nova_compute[189265]: 2025-09-30 07:48:07.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:48:07 compute-0 nova_compute[189265]: 2025-09-30 07:48:07.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:48:07 compute-0 sshd-session[225943]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:48:07 compute-0 sshd-session[225943]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:48:08 compute-0 nova_compute[189265]: 2025-09-30 07:48:08.304 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:48:08 compute-0 nova_compute[189265]: 2025-09-30 07:48:08.305 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:48:08 compute-0 nova_compute[189265]: 2025-09-30 07:48:08.305 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:48:08 compute-0 nova_compute[189265]: 2025-09-30 07:48:08.306 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:48:08 compute-0 nova_compute[189265]: 2025-09-30 07:48:08.560 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:48:08 compute-0 nova_compute[189265]: 2025-09-30 07:48:08.562 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:48:08 compute-0 nova_compute[189265]: 2025-09-30 07:48:08.600 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.038s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:48:08 compute-0 nova_compute[189265]: 2025-09-30 07:48:08.601 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5845MB free_disk=73.30352783203125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:48:08 compute-0 nova_compute[189265]: 2025-09-30 07:48:08.602 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:48:08 compute-0 nova_compute[189265]: 2025-09-30 07:48:08.602 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:48:08 compute-0 sshd-session[225947]: Invalid user rabbitmq from 52.224.109.126 port 44656
Sep 30 07:48:08 compute-0 sshd-session[225947]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:48:08 compute-0 sshd-session[225947]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126
Sep 30 07:48:08 compute-0 podman[225949]: 2025-09-30 07:48:08.839714077 +0000 UTC m=+0.096114665 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:48:09 compute-0 nova_compute[189265]: 2025-09-30 07:48:09.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:09 compute-0 unix_chkpwd[225973]: password check failed for user (root)
Sep 30 07:48:09 compute-0 sshd-session[225971]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126  user=root
Sep 30 07:48:09 compute-0 nova_compute[189265]: 2025-09-30 07:48:09.673 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:48:09 compute-0 nova_compute[189265]: 2025-09-30 07:48:09.674 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:48:08 up  1:45,  0 user,  load average: 0.32, 0.30, 0.29\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:48:09 compute-0 nova_compute[189265]: 2025-09-30 07:48:09.711 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:48:09 compute-0 nova_compute[189265]: 2025-09-30 07:48:09.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:10 compute-0 sshd-session[225943]: Failed password for invalid user oracle from 52.224.109.126 port 44650 ssh2
Sep 30 07:48:10 compute-0 nova_compute[189265]: 2025-09-30 07:48:10.223 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:48:10 compute-0 sshd-session[225943]: Connection closed by invalid user oracle 52.224.109.126 port 44650 [preauth]
Sep 30 07:48:10 compute-0 unix_chkpwd[225976]: password check failed for user (root)
Sep 30 07:48:10 compute-0 sshd-session[225974]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126  user=root
Sep 30 07:48:10 compute-0 nova_compute[189265]: 2025-09-30 07:48:10.737 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:48:10 compute-0 nova_compute[189265]: 2025-09-30 07:48:10.738 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.135s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:48:10 compute-0 sshd-session[225947]: Failed password for invalid user rabbitmq from 52.224.109.126 port 44656 ssh2
Sep 30 07:48:10 compute-0 sshd-session[225971]: Failed password for root from 52.224.109.126 port 44660 ssh2
Sep 30 07:48:11 compute-0 unix_chkpwd[225979]: password check failed for user (root)
Sep 30 07:48:11 compute-0 sshd-session[225977]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126  user=root
Sep 30 07:48:11 compute-0 sshd-session[225971]: Connection closed by authenticating user root 52.224.109.126 port 44660 [preauth]
Sep 30 07:48:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:48:11.707 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:b0:f2 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-89d0b86f-ec24-4260-97f2-66b301709d8f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89d0b86f-ec24-4260-97f2-66b301709d8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb73e37201b943a0807f29d1235cab63', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04eb0fdd-5bef-44fc-882c-b6c1a37f89e6, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=243a35b1-4c43-42fa-a9fb-38c61744b625) old=Port_Binding(mac=['fa:16:3e:1e:b0:f2'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-89d0b86f-ec24-4260-97f2-66b301709d8f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-89d0b86f-ec24-4260-97f2-66b301709d8f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fb73e37201b943a0807f29d1235cab63', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:48:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:48:11.709 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 243a35b1-4c43-42fa-a9fb-38c61744b625 in datapath 89d0b86f-ec24-4260-97f2-66b301709d8f updated
Sep 30 07:48:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:48:11.711 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 89d0b86f-ec24-4260-97f2-66b301709d8f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:48:11 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:48:11.712 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[cdb2dc2c-7863-4734-8560-7ffde734cb3d]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:48:11 compute-0 nova_compute[189265]: 2025-09-30 07:48:11.739 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:48:12 compute-0 sshd-session[225974]: Failed password for root from 52.224.109.126 port 44666 ssh2
Sep 30 07:48:12 compute-0 unix_chkpwd[225982]: password check failed for user (root)
Sep 30 07:48:12 compute-0 sshd-session[225980]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=52.224.109.126  user=root
Sep 30 07:48:12 compute-0 sshd-session[225974]: Connection closed by authenticating user root 52.224.109.126 port 44666 [preauth]
Sep 30 07:48:12 compute-0 sshd-session[225947]: Connection closed by invalid user rabbitmq 52.224.109.126 port 44656 [preauth]
Sep 30 07:48:13 compute-0 sshd[124648]: drop connection #2 from [52.224.109.126]:34242 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:13 compute-0 podman[225983]: 2025-09-30 07:48:13.501557212 +0000 UTC m=+0.084450829 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Sep 30 07:48:13 compute-0 sshd-session[225977]: Failed password for root from 52.224.109.126 port 44668 ssh2
Sep 30 07:48:14 compute-0 sshd[124648]: drop connection #2 from [52.224.109.126]:34244 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:14 compute-0 nova_compute[189265]: 2025-09-30 07:48:14.321 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:14 compute-0 sshd-session[225980]: Failed password for root from 52.224.109.126 port 44680 ssh2
Sep 30 07:48:14 compute-0 sshd-session[225980]: Connection closed by authenticating user root 52.224.109.126 port 44680 [preauth]
Sep 30 07:48:14 compute-0 nova_compute[189265]: 2025-09-30 07:48:14.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:15 compute-0 sshd[124648]: drop connection #1 from [52.224.109.126]:34254 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:15 compute-0 sshd-session[225977]: Connection closed by authenticating user root 52.224.109.126 port 44668 [preauth]
Sep 30 07:48:15 compute-0 nova_compute[189265]: 2025-09-30 07:48:15.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:48:16 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:34256 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:16 compute-0 podman[226004]: 2025-09-30 07:48:16.512291014 +0000 UTC m=+0.089483494 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4)
Sep 30 07:48:16 compute-0 podman[226005]: 2025-09-30 07:48:16.529261724 +0000 UTC m=+0.100033649 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:48:16 compute-0 podman[226006]: 2025-09-30 07:48:16.580411841 +0000 UTC m=+0.135779611 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 07:48:17 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:34272 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:18 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:34286 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:18 compute-0 nova_compute[189265]: 2025-09-30 07:48:18.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:48:18 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:34290 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:19 compute-0 nova_compute[189265]: 2025-09-30 07:48:19.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:19 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:34304 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:19 compute-0 nova_compute[189265]: 2025-09-30 07:48:19.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:48:20.595 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:48:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:48:20.596 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:48:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:48:20.596 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:48:20 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:34312 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:21 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:34324 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:48:22.284 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e6:b5:f3 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-7b493897-a341-4922-ac15-43adfc00513f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b493897-a341-4922-ac15-43adfc00513f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '22dc878beb8b401eb3fafde0ee425207', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=37c94646-f784-4823-b08f-48f99779bd12, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=f86d6ac1-e25d-4839-8477-fb7c8a4c2822) old=Port_Binding(mac=['fa:16:3e:e6:b5:f3'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-7b493897-a341-4922-ac15-43adfc00513f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7b493897-a341-4922-ac15-43adfc00513f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '22dc878beb8b401eb3fafde0ee425207', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:48:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:48:22.285 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port f86d6ac1-e25d-4839-8477-fb7c8a4c2822 in datapath 7b493897-a341-4922-ac15-43adfc00513f updated
Sep 30 07:48:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:48:22.287 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7b493897-a341-4922-ac15-43adfc00513f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:48:22 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:48:22.288 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[3804da0b-8058-47f9-b99d-dab4e6723da2]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:48:22 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:34330 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:23 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51332 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:24 compute-0 nova_compute[189265]: 2025-09-30 07:48:24.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:24 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51340 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:24 compute-0 nova_compute[189265]: 2025-09-30 07:48:24.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:48:25.087 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:48:25 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:48:25.088 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:48:25 compute-0 nova_compute[189265]: 2025-09-30 07:48:25.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:25 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51348 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:26 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51352 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:27 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51362 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:28 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:48:28.090 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:48:28 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51378 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:29 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51392 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:29 compute-0 nova_compute[189265]: 2025-09-30 07:48:29.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:29 compute-0 podman[226073]: 2025-09-30 07:48:29.493168217 +0000 UTC m=+0.076071437 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:48:29 compute-0 podman[199733]: time="2025-09-30T07:48:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:48:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:48:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:48:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:48:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3016 "" "Go-http-client/1.1"
Sep 30 07:48:29 compute-0 nova_compute[189265]: 2025-09-30 07:48:29.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:30 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51402 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:31 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51414 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:31 compute-0 ovn_controller[91436]: 2025-09-30T07:48:31Z|00292|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Sep 30 07:48:31 compute-0 openstack_network_exporter[201859]: ERROR   07:48:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:48:31 compute-0 openstack_network_exporter[201859]: ERROR   07:48:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:48:31 compute-0 openstack_network_exporter[201859]: ERROR   07:48:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:48:31 compute-0 openstack_network_exporter[201859]: ERROR   07:48:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:48:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:48:31 compute-0 openstack_network_exporter[201859]: ERROR   07:48:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:48:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:48:32 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:51426 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:33 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:52580 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:34 compute-0 sshd[124648]: drop connection #0 from [52.224.109.126]:52602 on [38.102.83.22]:22 penalty: failed authentication
Sep 30 07:48:34 compute-0 nova_compute[189265]: 2025-09-30 07:48:34.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:34 compute-0 nova_compute[189265]: 2025-09-30 07:48:34.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:39 compute-0 nova_compute[189265]: 2025-09-30 07:48:39.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:39 compute-0 podman[226097]: 2025-09-30 07:48:39.49867179 +0000 UTC m=+0.079328731 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, container_name=iscsid)
Sep 30 07:48:40 compute-0 nova_compute[189265]: 2025-09-30 07:48:40.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:43 compute-0 unix_chkpwd[226119]: password check failed for user (root)
Sep 30 07:48:43 compute-0 sshd-session[226117]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=152.32.144.167  user=root
Sep 30 07:48:44 compute-0 nova_compute[189265]: 2025-09-30 07:48:44.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:44 compute-0 podman[226120]: 2025-09-30 07:48:44.483463458 +0000 UTC m=+0.075283914 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=ubi9-minimal, container_name=openstack_network_exporter)
Sep 30 07:48:45 compute-0 nova_compute[189265]: 2025-09-30 07:48:45.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:45 compute-0 sshd-session[226117]: Failed password for root from 152.32.144.167 port 53168 ssh2
Sep 30 07:48:47 compute-0 podman[226141]: 2025-09-30 07:48:47.516841434 +0000 UTC m=+0.092890372 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 07:48:47 compute-0 podman[226142]: 2025-09-30 07:48:47.541209598 +0000 UTC m=+0.112756596 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 07:48:47 compute-0 podman[226143]: 2025-09-30 07:48:47.601515429 +0000 UTC m=+0.169508595 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 07:48:47 compute-0 sshd-session[226117]: Received disconnect from 152.32.144.167 port 53168:11: Bye Bye [preauth]
Sep 30 07:48:47 compute-0 sshd-session[226117]: Disconnected from authenticating user root 152.32.144.167 port 53168 [preauth]
Sep 30 07:48:49 compute-0 nova_compute[189265]: 2025-09-30 07:48:49.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:50 compute-0 nova_compute[189265]: 2025-09-30 07:48:50.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:54 compute-0 nova_compute[189265]: 2025-09-30 07:48:54.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:55 compute-0 nova_compute[189265]: 2025-09-30 07:48:55.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:58 compute-0 nova_compute[189265]: 2025-09-30 07:48:58.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:48:58 compute-0 nova_compute[189265]: 2025-09-30 07:48:58.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:48:59 compute-0 nova_compute[189265]: 2025-09-30 07:48:59.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:48:59 compute-0 podman[199733]: time="2025-09-30T07:48:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:48:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:48:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:48:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:48:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Sep 30 07:49:00 compute-0 nova_compute[189265]: 2025-09-30 07:49:00.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:00 compute-0 podman[226202]: 2025-09-30 07:49:00.502757272 +0000 UTC m=+0.073118702 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:49:00 compute-0 nova_compute[189265]: 2025-09-30 07:49:00.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:49:01 compute-0 openstack_network_exporter[201859]: ERROR   07:49:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:49:01 compute-0 openstack_network_exporter[201859]: ERROR   07:49:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:49:01 compute-0 openstack_network_exporter[201859]: ERROR   07:49:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:49:01 compute-0 openstack_network_exporter[201859]: ERROR   07:49:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:49:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:49:01 compute-0 openstack_network_exporter[201859]: ERROR   07:49:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:49:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:49:04 compute-0 nova_compute[189265]: 2025-09-30 07:49:04.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:05 compute-0 nova_compute[189265]: 2025-09-30 07:49:05.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:05 compute-0 nova_compute[189265]: 2025-09-30 07:49:05.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:49:05 compute-0 nova_compute[189265]: 2025-09-30 07:49:05.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:49:08 compute-0 nova_compute[189265]: 2025-09-30 07:49:08.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:49:09 compute-0 nova_compute[189265]: 2025-09-30 07:49:09.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:09 compute-0 nova_compute[189265]: 2025-09-30 07:49:09.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:49:09 compute-0 nova_compute[189265]: 2025-09-30 07:49:09.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:49:10 compute-0 nova_compute[189265]: 2025-09-30 07:49:10.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:10 compute-0 nova_compute[189265]: 2025-09-30 07:49:10.302 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:49:10 compute-0 nova_compute[189265]: 2025-09-30 07:49:10.303 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:49:10 compute-0 nova_compute[189265]: 2025-09-30 07:49:10.303 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:49:10 compute-0 nova_compute[189265]: 2025-09-30 07:49:10.304 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:49:10 compute-0 podman[226228]: 2025-09-30 07:49:10.51227419 +0000 UTC m=+0.092698876 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=iscsid, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=iscsid, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:49:10 compute-0 nova_compute[189265]: 2025-09-30 07:49:10.534 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:49:10 compute-0 nova_compute[189265]: 2025-09-30 07:49:10.535 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:49:10 compute-0 nova_compute[189265]: 2025-09-30 07:49:10.551 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.016s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:49:10 compute-0 nova_compute[189265]: 2025-09-30 07:49:10.551 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5846MB free_disk=73.30358123779297GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:49:10 compute-0 nova_compute[189265]: 2025-09-30 07:49:10.551 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:49:10 compute-0 nova_compute[189265]: 2025-09-30 07:49:10.552 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:49:11 compute-0 nova_compute[189265]: 2025-09-30 07:49:11.605 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:49:11 compute-0 nova_compute[189265]: 2025-09-30 07:49:11.606 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:49:10 up  1:46,  0 user,  load average: 0.12, 0.25, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:49:11 compute-0 nova_compute[189265]: 2025-09-30 07:49:11.635 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:49:12 compute-0 nova_compute[189265]: 2025-09-30 07:49:12.144 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:49:12 compute-0 nova_compute[189265]: 2025-09-30 07:49:12.658 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:49:12 compute-0 nova_compute[189265]: 2025-09-30 07:49:12.658 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.107s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:49:12 compute-0 nova_compute[189265]: 2025-09-30 07:49:12.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:49:14 compute-0 nova_compute[189265]: 2025-09-30 07:49:14.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:15 compute-0 nova_compute[189265]: 2025-09-30 07:49:15.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:15 compute-0 podman[226249]: 2025-09-30 07:49:15.504827752 +0000 UTC m=+0.085287613 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal)
Sep 30 07:49:17 compute-0 nova_compute[189265]: 2025-09-30 07:49:17.295 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:49:18 compute-0 podman[226271]: 2025-09-30 07:49:18.525785959 +0000 UTC m=+0.097520946 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 07:49:18 compute-0 podman[226272]: 2025-09-30 07:49:18.527793987 +0000 UTC m=+0.093699826 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:49:18 compute-0 podman[226273]: 2025-09-30 07:49:18.578327486 +0000 UTC m=+0.137254633 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:49:19 compute-0 nova_compute[189265]: 2025-09-30 07:49:19.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:20 compute-0 nova_compute[189265]: 2025-09-30 07:49:20.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:49:20.597 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:49:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:49:20.598 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:49:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:49:20.598 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:49:20 compute-0 nova_compute[189265]: 2025-09-30 07:49:20.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:49:20 compute-0 nova_compute[189265]: 2025-09-30 07:49:20.789 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 07:49:24 compute-0 nova_compute[189265]: 2025-09-30 07:49:24.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:25 compute-0 nova_compute[189265]: 2025-09-30 07:49:25.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:25 compute-0 unix_chkpwd[226338]: password check failed for user (root)
Sep 30 07:49:25 compute-0 sshd-session[226336]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.57.223.153  user=root
Sep 30 07:49:27 compute-0 sshd-session[226336]: Failed password for root from 103.57.223.153 port 59678 ssh2
Sep 30 07:49:27 compute-0 sshd-session[226336]: Received disconnect from 103.57.223.153 port 59678:11: Bye Bye [preauth]
Sep 30 07:49:27 compute-0 sshd-session[226336]: Disconnected from authenticating user root 103.57.223.153 port 59678 [preauth]
Sep 30 07:49:29 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 07:49:29 compute-0 nova_compute[189265]: 2025-09-30 07:49:29.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:29 compute-0 podman[199733]: time="2025-09-30T07:49:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:49:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:49:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:49:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:49:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Sep 30 07:49:30 compute-0 nova_compute[189265]: 2025-09-30 07:49:30.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:31 compute-0 openstack_network_exporter[201859]: ERROR   07:49:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:49:31 compute-0 openstack_network_exporter[201859]: ERROR   07:49:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:49:31 compute-0 openstack_network_exporter[201859]: ERROR   07:49:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:49:31 compute-0 openstack_network_exporter[201859]: ERROR   07:49:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:49:31 compute-0 openstack_network_exporter[201859]: ERROR   07:49:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:49:31 compute-0 podman[226340]: 2025-09-30 07:49:31.501006719 +0000 UTC m=+0.075400478 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:49:32 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:49:32.554 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:49:32 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:49:32.555 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:49:32 compute-0 nova_compute[189265]: 2025-09-30 07:49:32.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:34 compute-0 nova_compute[189265]: 2025-09-30 07:49:34.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:35 compute-0 nova_compute[189265]: 2025-09-30 07:49:35.065 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:38 compute-0 nova_compute[189265]: 2025-09-30 07:49:38.295 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:49:38 compute-0 nova_compute[189265]: 2025-09-30 07:49:38.296 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 07:49:38 compute-0 nova_compute[189265]: 2025-09-30 07:49:38.808 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 07:49:39 compute-0 nova_compute[189265]: 2025-09-30 07:49:39.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:40 compute-0 nova_compute[189265]: 2025-09-30 07:49:40.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:41 compute-0 podman[226365]: 2025-09-30 07:49:41.503302504 +0000 UTC m=+0.079723750 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=iscsid, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 07:49:42 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:49:42.556 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:49:44 compute-0 nova_compute[189265]: 2025-09-30 07:49:44.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:45 compute-0 nova_compute[189265]: 2025-09-30 07:49:45.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:46 compute-0 podman[226386]: 2025-09-30 07:49:46.505960296 +0000 UTC m=+0.079697380 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, maintainer=Red Hat, Inc., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.expose-services=, release=1755695350, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Sep 30 07:49:49 compute-0 nova_compute[189265]: 2025-09-30 07:49:49.402 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:49 compute-0 podman[226408]: 2025-09-30 07:49:49.49350714 +0000 UTC m=+0.072409510 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Sep 30 07:49:49 compute-0 podman[226409]: 2025-09-30 07:49:49.496675491 +0000 UTC m=+0.064122681 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 07:49:49 compute-0 podman[226415]: 2025-09-30 07:49:49.555366194 +0000 UTC m=+0.111325052 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=watcher_latest, container_name=ovn_controller)
Sep 30 07:49:50 compute-0 nova_compute[189265]: 2025-09-30 07:49:50.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:54 compute-0 nova_compute[189265]: 2025-09-30 07:49:54.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:55 compute-0 nova_compute[189265]: 2025-09-30 07:49:55.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:59 compute-0 nova_compute[189265]: 2025-09-30 07:49:59.476 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:49:59 compute-0 podman[199733]: time="2025-09-30T07:49:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:49:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:49:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:49:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:49:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3012 "" "Go-http-client/1.1"
Sep 30 07:50:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:50:00.111 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bf:21:3a 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3bc30bc6516c4e49aed5726171c74d6f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a9ce9f20-a5bb-4f10-94ac-caac3fb7e1ad, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=dcaa81b8-98f0-4a50-8978-177847c9169e) old=Port_Binding(mac=['fa:16:3e:bf:21:3a'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3bc30bc6516c4e49aed5726171c74d6f', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:50:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:50:00.112 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port dcaa81b8-98f0-4a50-8978-177847c9169e in datapath fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f updated
Sep 30 07:50:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:50:00.113 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:50:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:50:00.114 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c70db4b8-5064-4fa1-af6b-7a40794ec52b]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:50:00 compute-0 nova_compute[189265]: 2025-09-30 07:50:00.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:00 compute-0 nova_compute[189265]: 2025-09-30 07:50:00.296 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:50:00 compute-0 nova_compute[189265]: 2025-09-30 07:50:00.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:50:00 compute-0 nova_compute[189265]: 2025-09-30 07:50:00.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:50:01 compute-0 openstack_network_exporter[201859]: ERROR   07:50:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:50:01 compute-0 openstack_network_exporter[201859]: ERROR   07:50:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:50:01 compute-0 openstack_network_exporter[201859]: ERROR   07:50:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:50:01 compute-0 openstack_network_exporter[201859]: ERROR   07:50:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:50:01 compute-0 openstack_network_exporter[201859]: ERROR   07:50:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:50:02 compute-0 podman[226471]: 2025-09-30 07:50:02.487768157 +0000 UTC m=+0.064771670 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:50:04 compute-0 nova_compute[189265]: 2025-09-30 07:50:04.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:05 compute-0 nova_compute[189265]: 2025-09-30 07:50:05.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:05 compute-0 nova_compute[189265]: 2025-09-30 07:50:05.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:50:05 compute-0 nova_compute[189265]: 2025-09-30 07:50:05.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:50:09 compute-0 nova_compute[189265]: 2025-09-30 07:50:09.481 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:09 compute-0 nova_compute[189265]: 2025-09-30 07:50:09.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:50:09 compute-0 nova_compute[189265]: 2025-09-30 07:50:09.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:50:09 compute-0 nova_compute[189265]: 2025-09-30 07:50:09.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:50:10 compute-0 nova_compute[189265]: 2025-09-30 07:50:10.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:10 compute-0 nova_compute[189265]: 2025-09-30 07:50:10.377 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:50:10 compute-0 nova_compute[189265]: 2025-09-30 07:50:10.378 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:50:10 compute-0 nova_compute[189265]: 2025-09-30 07:50:10.378 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:50:10 compute-0 nova_compute[189265]: 2025-09-30 07:50:10.378 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:50:10 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:50:10.588 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d0:8a:10 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-1a1c2b40-d02f-42ee-bd85-5f1fe02924e1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a1c2b40-d02f-42ee-bd85-5f1fe02924e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2d470809703a44e69c2bc0d283b2bce4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=933d4add-41be-40c4-ad80-ad3667e0d6ee, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=411f7b0a-48f4-4a23-9646-91f5d4eaa4cd) old=Port_Binding(mac=['fa:16:3e:d0:8a:10'], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-1a1c2b40-d02f-42ee-bd85-5f1fe02924e1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a1c2b40-d02f-42ee-bd85-5f1fe02924e1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2d470809703a44e69c2bc0d283b2bce4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:50:10 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:50:10.589 100322 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 411f7b0a-48f4-4a23-9646-91f5d4eaa4cd in datapath 1a1c2b40-d02f-42ee-bd85-5f1fe02924e1 updated
Sep 30 07:50:10 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:50:10.591 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1a1c2b40-d02f-42ee-bd85-5f1fe02924e1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:50:10 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:50:10.592 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[2d02943a-189e-416e-a750-cf5bb3348886]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:50:10 compute-0 nova_compute[189265]: 2025-09-30 07:50:10.597 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:50:10 compute-0 nova_compute[189265]: 2025-09-30 07:50:10.599 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:50:10 compute-0 nova_compute[189265]: 2025-09-30 07:50:10.629 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:50:10 compute-0 nova_compute[189265]: 2025-09-30 07:50:10.631 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5847MB free_disk=73.30358123779297GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:50:10 compute-0 nova_compute[189265]: 2025-09-30 07:50:10.631 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:50:10 compute-0 nova_compute[189265]: 2025-09-30 07:50:10.632 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:50:11 compute-0 nova_compute[189265]: 2025-09-30 07:50:11.841 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:50:11 compute-0 nova_compute[189265]: 2025-09-30 07:50:11.841 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:50:10 up  1:47,  0 user,  load average: 0.08, 0.21, 0.25\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:50:11 compute-0 nova_compute[189265]: 2025-09-30 07:50:11.923 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing inventories for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 07:50:11 compute-0 nova_compute[189265]: 2025-09-30 07:50:11.999 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating ProviderTree inventory for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 07:50:12 compute-0 nova_compute[189265]: 2025-09-30 07:50:11.999 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating inventory in ProviderTree for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 07:50:12 compute-0 nova_compute[189265]: 2025-09-30 07:50:12.016 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing aggregate associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 07:50:12 compute-0 nova_compute[189265]: 2025-09-30 07:50:12.036 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing trait associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, traits: COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_FMA3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_AC97,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_ABM,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_MMX
,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,HW_CPU_X86_CLMUL _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
Sep 30 07:50:12 compute-0 nova_compute[189265]: 2025-09-30 07:50:12.058 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:50:12 compute-0 podman[226499]: 2025-09-30 07:50:12.479324204 +0000 UTC m=+0.062704200 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 07:50:12 compute-0 nova_compute[189265]: 2025-09-30 07:50:12.565 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:50:13 compute-0 nova_compute[189265]: 2025-09-30 07:50:13.072 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:50:13 compute-0 nova_compute[189265]: 2025-09-30 07:50:13.072 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.440s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:50:14 compute-0 nova_compute[189265]: 2025-09-30 07:50:14.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:15 compute-0 nova_compute[189265]: 2025-09-30 07:50:15.163 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:17 compute-0 podman[226519]: 2025-09-30 07:50:17.524049286 +0000 UTC m=+0.105841984 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_id=edpm, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Sep 30 07:50:19 compute-0 nova_compute[189265]: 2025-09-30 07:50:19.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:20 compute-0 nova_compute[189265]: 2025-09-30 07:50:20.165 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:20 compute-0 podman[226542]: 2025-09-30 07:50:20.510464919 +0000 UTC m=+0.087860035 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 07:50:20 compute-0 podman[226543]: 2025-09-30 07:50:20.512677133 +0000 UTC m=+0.082974695 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 07:50:20 compute-0 podman[226544]: 2025-09-30 07:50:20.545258103 +0000 UTC m=+0.117917403 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 07:50:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:50:20.599 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:50:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:50:20.599 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:50:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:50:20.599 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:50:22 compute-0 nova_compute[189265]: 2025-09-30 07:50:22.071 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:50:23 compute-0 nova_compute[189265]: 2025-09-30 07:50:23.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:50:24 compute-0 nova_compute[189265]: 2025-09-30 07:50:24.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:25 compute-0 nova_compute[189265]: 2025-09-30 07:50:25.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:29 compute-0 nova_compute[189265]: 2025-09-30 07:50:29.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:29 compute-0 podman[199733]: time="2025-09-30T07:50:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:50:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:50:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:50:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:50:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Sep 30 07:50:30 compute-0 nova_compute[189265]: 2025-09-30 07:50:30.171 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:31 compute-0 openstack_network_exporter[201859]: ERROR   07:50:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:50:31 compute-0 openstack_network_exporter[201859]: ERROR   07:50:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:50:31 compute-0 openstack_network_exporter[201859]: ERROR   07:50:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:50:31 compute-0 openstack_network_exporter[201859]: ERROR   07:50:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:50:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:50:31 compute-0 openstack_network_exporter[201859]: ERROR   07:50:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:50:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:50:33 compute-0 podman[226608]: 2025-09-30 07:50:33.496761366 +0000 UTC m=+0.075845669 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:50:34 compute-0 nova_compute[189265]: 2025-09-30 07:50:34.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:35 compute-0 nova_compute[189265]: 2025-09-30 07:50:35.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:50:37.777 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:50:37 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:50:37.778 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:50:37 compute-0 nova_compute[189265]: 2025-09-30 07:50:37.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:39 compute-0 nova_compute[189265]: 2025-09-30 07:50:39.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:40 compute-0 nova_compute[189265]: 2025-09-30 07:50:40.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:43 compute-0 podman[226633]: 2025-09-30 07:50:43.502615615 +0000 UTC m=+0.080166374 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:50:44 compute-0 nova_compute[189265]: 2025-09-30 07:50:44.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:45 compute-0 nova_compute[189265]: 2025-09-30 07:50:45.216 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:45 compute-0 nova_compute[189265]: 2025-09-30 07:50:45.889 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Acquiring lock "81f3990d-a6c1-43db-aa50-6b771758b8fb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:50:45 compute-0 nova_compute[189265]: 2025-09-30 07:50:45.890 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "81f3990d-a6c1-43db-aa50-6b771758b8fb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:50:46 compute-0 nova_compute[189265]: 2025-09-30 07:50:46.396 2 DEBUG nova.compute.manager [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Starting instance... _do_build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2472
Sep 30 07:50:46 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:50:46.780 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:50:46 compute-0 nova_compute[189265]: 2025-09-30 07:50:46.954 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:50:46 compute-0 nova_compute[189265]: 2025-09-30 07:50:46.955 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:50:46 compute-0 nova_compute[189265]: 2025-09-30 07:50:46.962 2 DEBUG nova.virt.hardware [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.12/site-packages/nova/virt/hardware.py:2528
Sep 30 07:50:46 compute-0 nova_compute[189265]: 2025-09-30 07:50:46.963 2 INFO nova.compute.claims [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Claim successful on node compute-0.ctlplane.example.com
Sep 30 07:50:48 compute-0 nova_compute[189265]: 2025-09-30 07:50:48.027 2 DEBUG nova.compute.provider_tree [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:50:48 compute-0 podman[226653]: 2025-09-30 07:50:48.525416587 +0000 UTC m=+0.101546270 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=ubi9-minimal-container, vcs-type=git, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, name=ubi9-minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350)
Sep 30 07:50:48 compute-0 nova_compute[189265]: 2025-09-30 07:50:48.536 2 DEBUG nova.scheduler.client.report [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:50:49 compute-0 nova_compute[189265]: 2025-09-30 07:50:49.048 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.093s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:50:49 compute-0 nova_compute[189265]: 2025-09-30 07:50:49.048 2 DEBUG nova.compute.manager [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2869
Sep 30 07:50:49 compute-0 nova_compute[189265]: 2025-09-30 07:50:49.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:49 compute-0 nova_compute[189265]: 2025-09-30 07:50:49.558 2 DEBUG nova.compute.manager [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2016
Sep 30 07:50:49 compute-0 nova_compute[189265]: 2025-09-30 07:50:49.558 2 DEBUG nova.network.neutron [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] allocate_for_instance() allocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1208
Sep 30 07:50:49 compute-0 nova_compute[189265]: 2025-09-30 07:50:49.559 2 WARNING neutronclient.v2_0.client [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:50:49 compute-0 nova_compute[189265]: 2025-09-30 07:50:49.559 2 WARNING neutronclient.v2_0.client [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:50:50 compute-0 nova_compute[189265]: 2025-09-30 07:50:50.065 2 INFO nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Sep 30 07:50:50 compute-0 nova_compute[189265]: 2025-09-30 07:50:50.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:50 compute-0 nova_compute[189265]: 2025-09-30 07:50:50.574 2 DEBUG nova.compute.manager [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Start building block device mappings for instance. _build_resources /usr/lib/python3.12/site-packages/nova/compute/manager.py:2904
Sep 30 07:50:51 compute-0 podman[226677]: 2025-09-30 07:50:51.513911638 +0000 UTC m=+0.087437564 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.4)
Sep 30 07:50:51 compute-0 podman[226676]: 2025-09-30 07:50:51.525969485 +0000 UTC m=+0.104360701 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd)
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.593 2 DEBUG nova.compute.manager [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:2678
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.595 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Creating instance directory _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5138
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.596 2 INFO nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Creating image(s)
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.597 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Acquiring lock "/var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.598 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "/var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.599 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "/var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.601 2 DEBUG oslo_utils.imageutils.format_inspector [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.608 2 DEBUG oslo_utils.imageutils.format_inspector [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.610 2 DEBUG oslo_concurrency.processutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:50:51 compute-0 podman[226678]: 2025-09-30 07:50:51.614565841 +0000 UTC m=+0.184078801 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_controller, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.675 2 DEBUG nova.network.neutron [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Successfully created port: 76fa05ca-c22d-48b8-b82e-ed72d1971c4c _create_port_minimal /usr/lib/python3.12/site-packages/nova/network/neutron.py:550
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.685 2 DEBUG oslo_concurrency.processutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.686 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.687 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.688 2 DEBUG oslo_utils.imageutils.format_inspector [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.694 2 DEBUG oslo_utils.imageutils.format_inspector [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.694 2 DEBUG oslo_concurrency.processutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.779 2 DEBUG oslo_concurrency.processutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.780 2 DEBUG oslo_concurrency.processutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.826 2 DEBUG oslo_concurrency.processutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk 1073741824" returned: 0 in 0.046s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.827 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.828 2 DEBUG oslo_concurrency.processutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.917 2 DEBUG oslo_concurrency.processutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.918 2 DEBUG nova.virt.disk.api [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Checking if we can resize image /var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:50:51 compute-0 nova_compute[189265]: 2025-09-30 07:50:51.919 2 DEBUG oslo_concurrency.processutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:50:52 compute-0 nova_compute[189265]: 2025-09-30 07:50:52.001 2 DEBUG oslo_concurrency.processutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:50:52 compute-0 nova_compute[189265]: 2025-09-30 07:50:52.003 2 DEBUG nova.virt.disk.api [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Cannot resize image /var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:50:52 compute-0 nova_compute[189265]: 2025-09-30 07:50:52.003 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Created local disks _create_image /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5270
Sep 30 07:50:52 compute-0 nova_compute[189265]: 2025-09-30 07:50:52.004 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Ensure instance console log exists: /var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/console.log _ensure_console_log_for_instance /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5017
Sep 30 07:50:52 compute-0 nova_compute[189265]: 2025-09-30 07:50:52.005 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:50:52 compute-0 nova_compute[189265]: 2025-09-30 07:50:52.005 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:50:52 compute-0 nova_compute[189265]: 2025-09-30 07:50:52.006 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:50:53 compute-0 nova_compute[189265]: 2025-09-30 07:50:53.661 2 DEBUG nova.network.neutron [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Successfully updated port: 76fa05ca-c22d-48b8-b82e-ed72d1971c4c _update_port /usr/lib/python3.12/site-packages/nova/network/neutron.py:588
Sep 30 07:50:53 compute-0 nova_compute[189265]: 2025-09-30 07:50:53.739 2 DEBUG nova.compute.manager [req-1ecdfe15-23ff-4efc-a82b-7c80e19bf30e req-aa96af16-fbe5-442a-b9e9-af7199eaff8c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Received event network-changed-76fa05ca-c22d-48b8-b82e-ed72d1971c4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:50:53 compute-0 nova_compute[189265]: 2025-09-30 07:50:53.740 2 DEBUG nova.compute.manager [req-1ecdfe15-23ff-4efc-a82b-7c80e19bf30e req-aa96af16-fbe5-442a-b9e9-af7199eaff8c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Refreshing instance network info cache due to event network-changed-76fa05ca-c22d-48b8-b82e-ed72d1971c4c. external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11817
Sep 30 07:50:53 compute-0 nova_compute[189265]: 2025-09-30 07:50:53.740 2 DEBUG oslo_concurrency.lockutils [req-1ecdfe15-23ff-4efc-a82b-7c80e19bf30e req-aa96af16-fbe5-442a-b9e9-af7199eaff8c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-81f3990d-a6c1-43db-aa50-6b771758b8fb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:50:53 compute-0 nova_compute[189265]: 2025-09-30 07:50:53.741 2 DEBUG oslo_concurrency.lockutils [req-1ecdfe15-23ff-4efc-a82b-7c80e19bf30e req-aa96af16-fbe5-442a-b9e9-af7199eaff8c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-81f3990d-a6c1-43db-aa50-6b771758b8fb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:50:53 compute-0 nova_compute[189265]: 2025-09-30 07:50:53.741 2 DEBUG nova.network.neutron [req-1ecdfe15-23ff-4efc-a82b-7c80e19bf30e req-aa96af16-fbe5-442a-b9e9-af7199eaff8c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Refreshing network info cache for port 76fa05ca-c22d-48b8-b82e-ed72d1971c4c _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2067
Sep 30 07:50:54 compute-0 nova_compute[189265]: 2025-09-30 07:50:54.169 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Acquiring lock "refresh_cache-81f3990d-a6c1-43db-aa50-6b771758b8fb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:50:54 compute-0 nova_compute[189265]: 2025-09-30 07:50:54.247 2 WARNING neutronclient.v2_0.client [req-1ecdfe15-23ff-4efc-a82b-7c80e19bf30e req-aa96af16-fbe5-442a-b9e9-af7199eaff8c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:50:54 compute-0 nova_compute[189265]: 2025-09-30 07:50:54.545 2 DEBUG nova.network.neutron [req-1ecdfe15-23ff-4efc-a82b-7c80e19bf30e req-aa96af16-fbe5-442a-b9e9-af7199eaff8c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:50:54 compute-0 nova_compute[189265]: 2025-09-30 07:50:54.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:54 compute-0 nova_compute[189265]: 2025-09-30 07:50:54.756 2 DEBUG nova.network.neutron [req-1ecdfe15-23ff-4efc-a82b-7c80e19bf30e req-aa96af16-fbe5-442a-b9e9-af7199eaff8c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:50:55 compute-0 nova_compute[189265]: 2025-09-30 07:50:55.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:55 compute-0 nova_compute[189265]: 2025-09-30 07:50:55.263 2 DEBUG oslo_concurrency.lockutils [req-1ecdfe15-23ff-4efc-a82b-7c80e19bf30e req-aa96af16-fbe5-442a-b9e9-af7199eaff8c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-81f3990d-a6c1-43db-aa50-6b771758b8fb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:50:55 compute-0 nova_compute[189265]: 2025-09-30 07:50:55.264 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Acquired lock "refresh_cache-81f3990d-a6c1-43db-aa50-6b771758b8fb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:50:55 compute-0 nova_compute[189265]: 2025-09-30 07:50:55.264 2 DEBUG nova.network.neutron [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.013 2 DEBUG nova.network.neutron [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.12/site-packages/nova/network/neutron.py:3383
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.266 2 WARNING neutronclient.v2_0.client [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.443 2 DEBUG nova.network.neutron [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Updating instance_info_cache with network_info: [{"id": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "address": "fa:16:3e:78:e7:1d", "network": {"id": "fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1009939787-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3bc30bc6516c4e49aed5726171c74d6f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76fa05ca-c2", "ovs_interfaceid": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.953 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Releasing lock "refresh_cache-81f3990d-a6c1-43db-aa50-6b771758b8fb" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.954 2 DEBUG nova.compute.manager [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Instance network_info: |[{"id": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "address": "fa:16:3e:78:e7:1d", "network": {"id": "fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1009939787-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3bc30bc6516c4e49aed5726171c74d6f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76fa05ca-c2", "ovs_interfaceid": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.12/site-packages/nova/compute/manager.py:2031
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.957 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Start _get_guest_xml network_info=[{"id": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "address": "fa:16:3e:78:e7:1d", "network": {"id": "fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1009939787-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3bc30bc6516c4e49aed5726171c74d6f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76fa05ca-c2", "ovs_interfaceid": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_type': 'disk', 'device_name': '/dev/vda', 'size': 0, 'encryption_options': None, 'encryption_secret_uuid': None, 'guest_format': None, 'encryption_format': None, 'boot_index': 0, 'disk_bus': 'virtio', 'encrypted': False, 'image_id': '0c6b92f5-9861-49e4-862d-3ffd84520dfa'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None}share_info=None _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8046
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.962 2 WARNING nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.964 2 DEBUG nova.virt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] InstanceDriverMetadata: InstanceDriverMetadata(root_type='image', root_id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', instance_meta=NovaInstanceMeta(name='tempest-TestExecuteZoneMigrationStrategy-server-935657189', uuid='81f3990d-a6c1-43db-aa50-6b771758b8fb'), owner=OwnerMeta(userid='f267aaead8a3437a8359b21224982b1c', username='tempest-TestExecuteZoneMigrationStrategy-642893642-project-admin', projectid='2d470809703a44e69c2bc0d283b2bce4', projectname='tempest-TestExecuteZoneMigrationStrategy-642893642'), image=ImageMeta(id='0c6b92f5-9861-49e4-862d-3ffd84520dfa', name=None, container_format='bare', disk_format='qcow2', min_disk=1, min_ram=0, properties={'hw_rng_model': 'virtio'}), flavor=FlavorMeta(name='m1.nano', flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0', memory_mb=128, vcpus=1, root_gb=1, ephemeral_gb=0, extra_specs={'hw_rng:allowed': 'True'}, swap=0), network_info=[{"id": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "address": "fa:16:3e:78:e7:1d", "network": {"id": "fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1009939787-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3bc30bc6516c4e49aed5726171c74d6f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76fa05ca-c2", "ovs_interfaceid": 
"76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}], nova_package='32.1.0-0.20250919142712.b99a882.el10', creation_time=1759218656.9647667) get_instance_driver_metadata /usr/lib/python3.12/site-packages/nova/virt/driver.py:437
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.970 2 DEBUG nova.virt.libvirt.host [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1698
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.971 2 DEBUG nova.virt.libvirt.host [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1708
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.975 2 DEBUG nova.virt.libvirt.host [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Searching host: 'compute-0.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1717
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.976 2 DEBUG nova.virt.libvirt.host [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.12/site-packages/nova/virt/libvirt/host.py:1724
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.977 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:5809
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.977 2 DEBUG nova.virt.hardware [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-09-30T07:07:57Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='ded17455-f8fe-40c7-8dae-6f0a2b208ae0',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-09-30T07:07:59Z,direct_url=<?>,disk_format='qcow2',id=0c6b92f5-9861-49e4-862d-3ffd84520dfa,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='4049964ce8244dacb50493f6676c6613',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-09-30T07:08:00Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:571
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.978 2 DEBUG nova.virt.hardware [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:356
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.979 2 DEBUG nova.virt.hardware [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:360
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.979 2 DEBUG nova.virt.hardware [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:396
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.979 2 DEBUG nova.virt.hardware [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:400
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.980 2 DEBUG nova.virt.hardware [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.12/site-packages/nova/virt/hardware.py:438
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.980 2 DEBUG nova.virt.hardware [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:577
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.981 2 DEBUG nova.virt.hardware [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:479
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.981 2 DEBUG nova.virt.hardware [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:509
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.982 2 DEBUG nova.virt.hardware [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:583
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.982 2 DEBUG nova.virt.hardware [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.12/site-packages/nova/virt/hardware.py:585
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.988 2 DEBUG nova.virt.libvirt.vif [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:50:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-935657189',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-935657189',id=35,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2d470809703a44e69c2bc0d283b2bce4',ramdisk_id='',reservation_id='r-jwcbn16t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-642893642',owner_user_name='tempest-TestExecuteZone
MigrationStrategy-642893642-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:50:50Z,user_data=None,user_id='f267aaead8a3437a8359b21224982b1c',uuid=81f3990d-a6c1-43db-aa50-6b771758b8fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "address": "fa:16:3e:78:e7:1d", "network": {"id": "fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1009939787-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3bc30bc6516c4e49aed5726171c74d6f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76fa05ca-c2", "ovs_interfaceid": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:574
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.989 2 DEBUG nova.network.os_vif_util [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Converting VIF {"id": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "address": "fa:16:3e:78:e7:1d", "network": {"id": "fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1009939787-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3bc30bc6516c4e49aed5726171c74d6f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76fa05ca-c2", "ovs_interfaceid": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.990 2 DEBUG nova.network.os_vif_util [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:e7:1d,bridge_name='br-int',has_traffic_filtering=True,id=76fa05ca-c22d-48b8-b82e-ed72d1971c4c,network=Network(fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76fa05ca-c2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:50:56 compute-0 nova_compute[189265]: 2025-09-30 07:50:56.991 2 DEBUG nova.objects.instance [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 81f3990d-a6c1-43db-aa50-6b771758b8fb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.501 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] End _get_guest_xml xml=<domain type="kvm">
Sep 30 07:50:57 compute-0 nova_compute[189265]:   <uuid>81f3990d-a6c1-43db-aa50-6b771758b8fb</uuid>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   <name>instance-00000023</name>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   <memory>131072</memory>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   <vcpu>1</vcpu>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   <metadata>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <nova:package version="32.1.0-0.20250919142712.b99a882.el10"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <nova:name>tempest-TestExecuteZoneMigrationStrategy-server-935657189</nova:name>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <nova:creationTime>2025-09-30 07:50:56</nova:creationTime>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <nova:flavor name="m1.nano" id="ded17455-f8fe-40c7-8dae-6f0a2b208ae0">
Sep 30 07:50:57 compute-0 nova_compute[189265]:         <nova:memory>128</nova:memory>
Sep 30 07:50:57 compute-0 nova_compute[189265]:         <nova:disk>1</nova:disk>
Sep 30 07:50:57 compute-0 nova_compute[189265]:         <nova:swap>0</nova:swap>
Sep 30 07:50:57 compute-0 nova_compute[189265]:         <nova:ephemeral>0</nova:ephemeral>
Sep 30 07:50:57 compute-0 nova_compute[189265]:         <nova:vcpus>1</nova:vcpus>
Sep 30 07:50:57 compute-0 nova_compute[189265]:         <nova:extraSpecs>
Sep 30 07:50:57 compute-0 nova_compute[189265]:           <nova:extraSpec name="hw_rng:allowed">True</nova:extraSpec>
Sep 30 07:50:57 compute-0 nova_compute[189265]:         </nova:extraSpecs>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       </nova:flavor>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <nova:image uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa">
Sep 30 07:50:57 compute-0 nova_compute[189265]:         <nova:containerFormat>bare</nova:containerFormat>
Sep 30 07:50:57 compute-0 nova_compute[189265]:         <nova:diskFormat>qcow2</nova:diskFormat>
Sep 30 07:50:57 compute-0 nova_compute[189265]:         <nova:minDisk>1</nova:minDisk>
Sep 30 07:50:57 compute-0 nova_compute[189265]:         <nova:minRam>0</nova:minRam>
Sep 30 07:50:57 compute-0 nova_compute[189265]:         <nova:properties>
Sep 30 07:50:57 compute-0 nova_compute[189265]:           <nova:property name="hw_rng_model">virtio</nova:property>
Sep 30 07:50:57 compute-0 nova_compute[189265]:         </nova:properties>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       </nova:image>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <nova:owner>
Sep 30 07:50:57 compute-0 nova_compute[189265]:         <nova:user uuid="f267aaead8a3437a8359b21224982b1c">tempest-TestExecuteZoneMigrationStrategy-642893642-project-admin</nova:user>
Sep 30 07:50:57 compute-0 nova_compute[189265]:         <nova:project uuid="2d470809703a44e69c2bc0d283b2bce4">tempest-TestExecuteZoneMigrationStrategy-642893642</nova:project>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       </nova:owner>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <nova:root type="image" uuid="0c6b92f5-9861-49e4-862d-3ffd84520dfa"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <nova:ports>
Sep 30 07:50:57 compute-0 nova_compute[189265]:         <nova:port uuid="76fa05ca-c22d-48b8-b82e-ed72d1971c4c">
Sep 30 07:50:57 compute-0 nova_compute[189265]:           <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:         </nova:port>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       </nova:ports>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     </nova:instance>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   </metadata>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   <sysinfo type="smbios">
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <system>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <entry name="manufacturer">RDO</entry>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <entry name="product">OpenStack Compute</entry>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <entry name="version">32.1.0-0.20250919142712.b99a882.el10</entry>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <entry name="serial">81f3990d-a6c1-43db-aa50-6b771758b8fb</entry>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <entry name="uuid">81f3990d-a6c1-43db-aa50-6b771758b8fb</entry>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <entry name="family">Virtual Machine</entry>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     </system>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   </sysinfo>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   <os>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <type arch="x86_64" machine="q35">hvm</type>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <boot dev="hd"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <smbios mode="sysinfo"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   </os>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   <features>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <acpi/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <apic/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <vmcoreinfo/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   </features>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   <clock offset="utc">
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <timer name="pit" tickpolicy="delay"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <timer name="rtc" tickpolicy="catchup"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <timer name="hpet" present="no"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   </clock>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   <cpu mode="host-model" match="exact">
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <topology sockets="1" cores="1" threads="1"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   </cpu>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   <devices>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <disk type="file" device="disk">
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <driver name="qemu" type="qcow2" cache="none"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <target dev="vda" bus="virtio"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <disk type="file" device="cdrom">
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <driver name="qemu" type="raw" cache="none"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <source file="/var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk.config"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <target dev="sda" bus="sata"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     </disk>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <interface type="ethernet">
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <mac address="fa:16:3e:78:e7:1d"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <driver name="vhost" rx_queue_size="512"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <mtu size="1442"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <target dev="tap76fa05ca-c2"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     </interface>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <serial type="pty">
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <log file="/var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/console.log" append="off"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     </serial>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <video>
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <model type="virtio"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     </video>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <input type="tablet" bus="usb"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <rng model="virtio">
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <backend model="random">/dev/urandom</backend>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     </rng>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="pci" model="pcie-root-port"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <controller type="usb" index="0"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     <memballoon model="virtio" autodeflate="on" freePageReporting="on">
Sep 30 07:50:57 compute-0 nova_compute[189265]:       <stats period="10"/>
Sep 30 07:50:57 compute-0 nova_compute[189265]:     </memballoon>
Sep 30 07:50:57 compute-0 nova_compute[189265]:   </devices>
Sep 30 07:50:57 compute-0 nova_compute[189265]: </domain>
Sep 30 07:50:57 compute-0 nova_compute[189265]:  _get_guest_xml /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:8052
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.502 2 DEBUG nova.compute.manager [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Preparing to wait for external event network-vif-plugged-76fa05ca-c22d-48b8-b82e-ed72d1971c4c prepare_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:307
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.503 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Acquiring lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.503 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.504 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.505 2 DEBUG nova.virt.libvirt.vif [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='',created_at=2025-09-30T07:50:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-935657189',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-935657189',id=35,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2d470809703a44e69c2bc0d283b2bce4',ramdisk_id='',reservation_id='r-jwcbn16t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-642893642',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-642893642-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:50:50Z,user_data=None,user_id='f267aaead8a3437a8359b21224982b1c',uuid=81f3990d-a6c1-43db-aa50-6b771758b8fb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "address": "fa:16:3e:78:e7:1d", "network": {"id": "fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1009939787-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3bc30bc6516c4e49aed5726171c74d6f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76fa05ca-c2", "ovs_interfaceid": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.505 2 DEBUG nova.network.os_vif_util [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Converting VIF {"id": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "address": "fa:16:3e:78:e7:1d", "network": {"id": "fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1009939787-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3bc30bc6516c4e49aed5726171c74d6f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76fa05ca-c2", "ovs_interfaceid": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.506 2 DEBUG nova.network.os_vif_util [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:e7:1d,bridge_name='br-int',has_traffic_filtering=True,id=76fa05ca-c22d-48b8-b82e-ed72d1971c4c,network=Network(fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76fa05ca-c2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.507 2 DEBUG os_vif [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:e7:1d,bridge_name='br-int',has_traffic_filtering=True,id=76fa05ca-c22d-48b8-b82e-ed72d1971c4c,network=Network(fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76fa05ca-c2') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.508 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.509 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.510 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': 'f41b6ef2-bd92-5565-8515-ff09144ec809', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.517 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap76fa05ca-c2, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.518 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tap76fa05ca-c2, col_values=(('qos', UUID('697b96ce-ba1d-4a0e-87b7-482421308bcd')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.519 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tap76fa05ca-c2, col_values=(('external_ids', {'iface-id': '76fa05ca-c22d-48b8-b82e-ed72d1971c4c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:78:e7:1d', 'vm-uuid': '81f3990d-a6c1-43db-aa50-6b771758b8fb'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:50:57 compute-0 NetworkManager[51813]: <info>  [1759218657.5221] manager: (tap76fa05ca-c2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.523 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:57 compute-0 nova_compute[189265]: 2025-09-30 07:50:57.528 2 INFO os_vif [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:e7:1d,bridge_name='br-int',has_traffic_filtering=True,id=76fa05ca-c22d-48b8-b82e-ed72d1971c4c,network=Network(fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76fa05ca-c2')
Sep 30 07:50:59 compute-0 nova_compute[189265]: 2025-09-30 07:50:59.069 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:50:59 compute-0 nova_compute[189265]: 2025-09-30 07:50:59.069 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:13022
Sep 30 07:50:59 compute-0 nova_compute[189265]: 2025-09-30 07:50:59.070 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] No VIF found with MAC fa:16:3e:78:e7:1d, not building metadata _build_interface_metadata /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:12998
Sep 30 07:50:59 compute-0 nova_compute[189265]: 2025-09-30 07:50:59.070 2 INFO nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Using config drive
Sep 30 07:50:59 compute-0 nova_compute[189265]: 2025-09-30 07:50:59.583 2 WARNING neutronclient.v2_0.client [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:50:59 compute-0 podman[199733]: time="2025-09-30T07:50:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:50:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:50:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:50:59 compute-0 nova_compute[189265]: 2025-09-30 07:50:59.752 2 INFO nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Creating config drive at /var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk.config
Sep 30 07:50:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:50:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Sep 30 07:50:59 compute-0 nova_compute[189265]: 2025-09-30 07:50:59.758 2 DEBUG oslo_concurrency.processutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpzi16_ial execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:50:59 compute-0 nova_compute[189265]: 2025-09-30 07:50:59.899 2 DEBUG oslo_concurrency.processutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 32.1.0-0.20250919142712.b99a882.el10 -quiet -J -r -V config-2 /tmp/tmpzi16_ial" returned: 0 in 0.141s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:50:59 compute-0 kernel: tap76fa05ca-c2: entered promiscuous mode
Sep 30 07:50:59 compute-0 NetworkManager[51813]: <info>  [1759218659.9896] manager: (tap76fa05ca-c2): new Tun device (/org/freedesktop/NetworkManager/Devices/102)
Sep 30 07:50:59 compute-0 ovn_controller[91436]: 2025-09-30T07:50:59Z|00293|binding|INFO|Claiming lport 76fa05ca-c22d-48b8-b82e-ed72d1971c4c for this chassis.
Sep 30 07:50:59 compute-0 ovn_controller[91436]: 2025-09-30T07:50:59Z|00294|binding|INFO|76fa05ca-c22d-48b8-b82e-ed72d1971c4c: Claiming fa:16:3e:78:e7:1d 10.100.0.14
Sep 30 07:50:59 compute-0 nova_compute[189265]: 2025-09-30 07:50:59.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:50:59 compute-0 nova_compute[189265]: 2025-09-30 07:50:59.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:00 compute-0 nova_compute[189265]: 2025-09-30 07:51:00.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:00 compute-0 nova_compute[189265]: 2025-09-30 07:51:00.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.022 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:e7:1d 10.100.0.14'], port_security=['fa:16:3e:78:e7:1d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '81f3990d-a6c1-43db-aa50-6b771758b8fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2d470809703a44e69c2bc0d283b2bce4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '33f74b46-9f43-491b-8b86-3db30861b2d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a9ce9f20-a5bb-4f10-94ac-caac3fb7e1ad, chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=76fa05ca-c22d-48b8-b82e-ed72d1971c4c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.025 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 76fa05ca-c22d-48b8-b82e-ed72d1971c4c in datapath fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f bound to our chassis
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.026 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f
Sep 30 07:51:00 compute-0 systemd-udevd[226773]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.047 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[61b4fe60-d14e-43f4-94f3-17830746fc59]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.048 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfe7b13d3-f1 in ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f namespace provision_datapath /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:794
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.051 210650 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfe7b13d3-f0 not found in namespace None get_link_id /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:203
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.051 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b0e3b8f2-05a6-4084-9582-047ab8bff5da]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.052 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[1cc88d43-001e-4ec6-afd3-2be2d5f3e1f8]: (4, False) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 systemd-machined[149233]: New machine qemu-25-instance-00000023.
Sep 30 07:51:00 compute-0 sshd-session[226675]: error: kex_exchange_identification: read: Connection timed out
Sep 30 07:51:00 compute-0 sshd-session[226675]: banner exchange: Connection from 120.48.85.137 port 46256: Connection timed out
Sep 30 07:51:00 compute-0 NetworkManager[51813]: <info>  [1759218660.0773] device (tap76fa05ca-c2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:51:00 compute-0 NetworkManager[51813]: <info>  [1759218660.0782] device (tap76fa05ca-c2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.080 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[a5cfcbba-302b-49f2-be95-7629ac29dedc]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 systemd[1]: Started Virtual Machine qemu-25-instance-00000023.
Sep 30 07:51:00 compute-0 ovn_controller[91436]: 2025-09-30T07:51:00Z|00295|binding|INFO|Setting lport 76fa05ca-c22d-48b8-b82e-ed72d1971c4c ovn-installed in OVS
Sep 30 07:51:00 compute-0 ovn_controller[91436]: 2025-09-30T07:51:00Z|00296|binding|INFO|Setting lport 76fa05ca-c22d-48b8-b82e-ed72d1971c4c up in Southbound
Sep 30 07:51:00 compute-0 nova_compute[189265]: 2025-09-30 07:51:00.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.101 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[b96149aa-1781-46a4-b3ea-4199e437e355]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.140 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[e3df3e6a-0b71-44c8-9fd6-82d7747b7eec]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.146 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[82a93665-ea7f-43d9-aebe-4f0b54245cef]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 NetworkManager[51813]: <info>  [1759218660.1480] manager: (tapfe7b13d3-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/103)
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.207 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[b0d4246c-4d6c-440d-8c41-dbccac2b281f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.213 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[7f1632f0-d8b4-450b-9292-25705d488a74]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 NetworkManager[51813]: <info>  [1759218660.2477] device (tapfe7b13d3-f0): carrier: link connected
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.258 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[4b3d541c-c54c-41a9-8166-da84209fb78f]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 nova_compute[189265]: 2025-09-30 07:51:00.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.281 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[60f9fb80-fce9-4762-a9e3-ab399a733002]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe7b13d3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:21:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651796, 'reachable_time': 31867, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226806, 'error': None, 'target': 'ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.305 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[0eddbb05-1f6c-4e49-9c10-c1a0a79746ef]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febf:213a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 651796, 'tstamp': 651796}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226808, 'error': None, 'target': 'ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.332 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[919faf40-6f4a-4e22-82ba-9b736d4f6c03]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe7b13d3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:21:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651796, 'reachable_time': 31867, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226809, 'error': None, 'target': 'ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.381 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[a1a467bd-5362-49c4-ab24-37334bcc01ab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.486 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[dcf044f0-f546-4972-8c80-680ec1608347]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.490 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe7b13d3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.490 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.491 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe7b13d3-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:00 compute-0 nova_compute[189265]: 2025-09-30 07:51:00.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:00 compute-0 kernel: tapfe7b13d3-f0: entered promiscuous mode
Sep 30 07:51:00 compute-0 NetworkManager[51813]: <info>  [1759218660.5394] manager: (tapfe7b13d3-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Sep 30 07:51:00 compute-0 nova_compute[189265]: 2025-09-30 07:51:00.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.542 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe7b13d3-f0, col_values=(('external_ids', {'iface-id': 'dcaa81b8-98f0-4a50-8978-177847c9169e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:00 compute-0 nova_compute[189265]: 2025-09-30 07:51:00.544 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:00 compute-0 ovn_controller[91436]: 2025-09-30T07:51:00Z|00297|binding|INFO|Releasing lport dcaa81b8-98f0-4a50-8978-177847c9169e from this chassis (sb_readonly=0)
Sep 30 07:51:00 compute-0 nova_compute[189265]: 2025-09-30 07:51:00.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.562 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[c6308ea8-d94f-431b-8c22-b765b9e2c250]: (4, '') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.563 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.563 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.563 100322 DEBUG neutron.agent.linux.external_process [-] No haproxy process started for fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f disable /usr/lib/python3.12/site-packages/neutron/agent/linux/external_process.py:173
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.563 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.564 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[d411836a-7360-4fd9-87a0-af17a71ffc0c]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.564 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.564 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[34eceed7-d493-4e60-9142-6aba7a2346c7]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.565 100322 DEBUG neutron.agent.metadata.driver_base [-] haproxy_cfg = 
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: global
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     log         /dev/log local0 debug
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     log-tag     haproxy-metadata-proxy-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     user        root
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     group       root
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     maxconn     1024
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     pidfile     /var/lib/neutron/external/pids/fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f.pid.haproxy
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     daemon
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: defaults
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     log global
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     mode http
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     option httplog
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     option dontlognull
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     option http-server-close
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     option forwardfor
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     retries                 3
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     timeout http-request    30s
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     timeout connect         30s
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     timeout client          32s
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     timeout server          32s
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     timeout http-keep-alive 30s
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: listen listener
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     bind 169.254.169.254:80
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     server metadata /var/lib/neutron/metadata_proxy
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:     http-request add-header X-OVN-Network-ID fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]:  create_config_file /usr/lib/python3.12/site-packages/neutron/agent/metadata/driver_base.py:155
Sep 30 07:51:00 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:00.565 100322 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'env', 'PROCESS_TAG=haproxy-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f.conf'] create_process /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:84
Sep 30 07:51:00 compute-0 nova_compute[189265]: 2025-09-30 07:51:00.741 2 DEBUG nova.compute.manager [req-25dcda6b-15e4-4883-bc11-e4d63a596f18 req-efda5791-f8ab-48e0-9682-f13c3099cdc5 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Received event network-vif-plugged-76fa05ca-c22d-48b8-b82e-ed72d1971c4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:51:00 compute-0 nova_compute[189265]: 2025-09-30 07:51:00.748 2 DEBUG oslo_concurrency.lockutils [req-25dcda6b-15e4-4883-bc11-e4d63a596f18 req-efda5791-f8ab-48e0-9682-f13c3099cdc5 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:51:00 compute-0 nova_compute[189265]: 2025-09-30 07:51:00.748 2 DEBUG oslo_concurrency.lockutils [req-25dcda6b-15e4-4883-bc11-e4d63a596f18 req-efda5791-f8ab-48e0-9682-f13c3099cdc5 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:51:00 compute-0 nova_compute[189265]: 2025-09-30 07:51:00.749 2 DEBUG oslo_concurrency.lockutils [req-25dcda6b-15e4-4883-bc11-e4d63a596f18 req-efda5791-f8ab-48e0-9682-f13c3099cdc5 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:51:00 compute-0 nova_compute[189265]: 2025-09-30 07:51:00.749 2 DEBUG nova.compute.manager [req-25dcda6b-15e4-4883-bc11-e4d63a596f18 req-efda5791-f8ab-48e0-9682-f13c3099cdc5 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Processing event network-vif-plugged-76fa05ca-c22d-48b8-b82e-ed72d1971c4c _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11572
Sep 30 07:51:00 compute-0 nova_compute[189265]: 2025-09-30 07:51:00.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:51:00 compute-0 nova_compute[189265]: 2025-09-30 07:51:00.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:51:00 compute-0 nova_compute[189265]: 2025-09-30 07:51:00.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:51:00 compute-0 podman[226845]: 2025-09-30 07:51:00.970496244 +0000 UTC m=+0.065434248 container create b7edb61101377cdf61549f13f36e105a5dfe6ab712c01101ed065bbc5cb8b2c2 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 07:51:01 compute-0 systemd[1]: Started libpod-conmon-b7edb61101377cdf61549f13f36e105a5dfe6ab712c01101ed065bbc5cb8b2c2.scope.
Sep 30 07:51:01 compute-0 podman[226845]: 2025-09-30 07:51:00.936451772 +0000 UTC m=+0.031389876 image pull eeebcc09bc72f81ab45f5ab87eb8f6a7b554b949227aeec082bdb0732754ddc8 38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest
Sep 30 07:51:01 compute-0 systemd[1]: Started libcrun container.
Sep 30 07:51:01 compute-0 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/976a317208e8e60d58c0cbef569bf75e88364e311dd8e1bc03a80c9a74d0bc8c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Sep 30 07:51:01 compute-0 podman[226845]: 2025-09-30 07:51:01.079241311 +0000 UTC m=+0.174179305 container init b7edb61101377cdf61549f13f36e105a5dfe6ab712c01101ed065bbc5cb8b2c2 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:51:01 compute-0 podman[226845]: 2025-09-30 07:51:01.088285081 +0000 UTC m=+0.183223075 container start b7edb61101377cdf61549f13f36e105a5dfe6ab712c01101ed065bbc5cb8b2c2 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4)
Sep 30 07:51:01 compute-0 neutron-haproxy-ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f[226862]: [NOTICE]   (226866) : New worker (226868) forked
Sep 30 07:51:01 compute-0 neutron-haproxy-ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f[226862]: [NOTICE]   (226866) : Loading success.
Sep 30 07:51:01 compute-0 nova_compute[189265]: 2025-09-30 07:51:01.404 2 DEBUG nova.compute.manager [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:602
Sep 30 07:51:01 compute-0 nova_compute[189265]: 2025-09-30 07:51:01.409 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Guest created on hypervisor spawn /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:4816
Sep 30 07:51:01 compute-0 openstack_network_exporter[201859]: ERROR   07:51:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:51:01 compute-0 openstack_network_exporter[201859]: ERROR   07:51:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:51:01 compute-0 openstack_network_exporter[201859]: ERROR   07:51:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:51:01 compute-0 openstack_network_exporter[201859]: ERROR   07:51:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:51:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:51:01 compute-0 nova_compute[189265]: 2025-09-30 07:51:01.414 2 INFO nova.virt.libvirt.driver [-] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Instance spawned successfully.
Sep 30 07:51:01 compute-0 openstack_network_exporter[201859]: ERROR   07:51:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:51:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:51:01 compute-0 nova_compute[189265]: 2025-09-30 07:51:01.414 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:985
Sep 30 07:51:01 compute-0 nova_compute[189265]: 2025-09-30 07:51:01.935 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:51:01 compute-0 nova_compute[189265]: 2025-09-30 07:51:01.935 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:51:01 compute-0 nova_compute[189265]: 2025-09-30 07:51:01.936 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:51:01 compute-0 nova_compute[189265]: 2025-09-30 07:51:01.937 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:51:01 compute-0 nova_compute[189265]: 2025-09-30 07:51:01.937 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:51:01 compute-0 nova_compute[189265]: 2025-09-30 07:51:01.938 2 DEBUG nova.virt.libvirt.driver [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:1014
Sep 30 07:51:02 compute-0 nova_compute[189265]: 2025-09-30 07:51:02.450 2 INFO nova.compute.manager [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Took 10.86 seconds to spawn the instance on the hypervisor.
Sep 30 07:51:02 compute-0 nova_compute[189265]: 2025-09-30 07:51:02.451 2 DEBUG nova.compute.manager [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:51:02 compute-0 nova_compute[189265]: 2025-09-30 07:51:02.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:02 compute-0 nova_compute[189265]: 2025-09-30 07:51:02.826 2 DEBUG nova.compute.manager [req-2da4090a-930e-4d97-a6c1-3967f0a21aa6 req-f14d9579-d5c2-46d4-a177-1eb9fe419926 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Received event network-vif-plugged-76fa05ca-c22d-48b8-b82e-ed72d1971c4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:51:02 compute-0 nova_compute[189265]: 2025-09-30 07:51:02.827 2 DEBUG oslo_concurrency.lockutils [req-2da4090a-930e-4d97-a6c1-3967f0a21aa6 req-f14d9579-d5c2-46d4-a177-1eb9fe419926 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:51:02 compute-0 nova_compute[189265]: 2025-09-30 07:51:02.829 2 DEBUG oslo_concurrency.lockutils [req-2da4090a-930e-4d97-a6c1-3967f0a21aa6 req-f14d9579-d5c2-46d4-a177-1eb9fe419926 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:51:02 compute-0 nova_compute[189265]: 2025-09-30 07:51:02.829 2 DEBUG oslo_concurrency.lockutils [req-2da4090a-930e-4d97-a6c1-3967f0a21aa6 req-f14d9579-d5c2-46d4-a177-1eb9fe419926 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:51:02 compute-0 nova_compute[189265]: 2025-09-30 07:51:02.830 2 DEBUG nova.compute.manager [req-2da4090a-930e-4d97-a6c1-3967f0a21aa6 req-f14d9579-d5c2-46d4-a177-1eb9fe419926 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] No waiting events found dispatching network-vif-plugged-76fa05ca-c22d-48b8-b82e-ed72d1971c4c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:51:02 compute-0 nova_compute[189265]: 2025-09-30 07:51:02.830 2 WARNING nova.compute.manager [req-2da4090a-930e-4d97-a6c1-3967f0a21aa6 req-f14d9579-d5c2-46d4-a177-1eb9fe419926 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Received unexpected event network-vif-plugged-76fa05ca-c22d-48b8-b82e-ed72d1971c4c for instance with vm_state active and task_state None.
Sep 30 07:51:02 compute-0 nova_compute[189265]: 2025-09-30 07:51:02.992 2 INFO nova.compute.manager [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Took 16.08 seconds to build instance.
Sep 30 07:51:03 compute-0 nova_compute[189265]: 2025-09-30 07:51:03.498 2 DEBUG oslo_concurrency.lockutils [None req-e2f5285b-8454-477d-b244-76be8ebf28fd f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "81f3990d-a6c1-43db-aa50-6b771758b8fb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.608s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:51:04 compute-0 podman[226877]: 2025-09-30 07:51:04.518555624 +0000 UTC m=+0.091582793 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:51:05 compute-0 nova_compute[189265]: 2025-09-30 07:51:05.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:06 compute-0 nova_compute[189265]: 2025-09-30 07:51:06.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:51:06 compute-0 nova_compute[189265]: 2025-09-30 07:51:06.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:51:07 compute-0 nova_compute[189265]: 2025-09-30 07:51:07.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:09 compute-0 sshd-session[226754]: error: kex_exchange_identification: read: Connection timed out
Sep 30 07:51:09 compute-0 sshd-session[226754]: banner exchange: Connection from 14.103.71.220 port 36228: Connection timed out
Sep 30 07:51:09 compute-0 nova_compute[189265]: 2025-09-30 07:51:09.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:51:09 compute-0 nova_compute[189265]: 2025-09-30 07:51:09.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:51:10 compute-0 nova_compute[189265]: 2025-09-30 07:51:10.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:10 compute-0 nova_compute[189265]: 2025-09-30 07:51:10.304 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:51:10 compute-0 nova_compute[189265]: 2025-09-30 07:51:10.304 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:51:10 compute-0 nova_compute[189265]: 2025-09-30 07:51:10.304 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:51:10 compute-0 nova_compute[189265]: 2025-09-30 07:51:10.305 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:51:11 compute-0 nova_compute[189265]: 2025-09-30 07:51:11.346 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:51:11 compute-0 nova_compute[189265]: 2025-09-30 07:51:11.428 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk --force-share --output=json" returned: 0 in 0.082s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:51:11 compute-0 nova_compute[189265]: 2025-09-30 07:51:11.430 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:51:11 compute-0 nova_compute[189265]: 2025-09-30 07:51:11.480 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:51:11 compute-0 nova_compute[189265]: 2025-09-30 07:51:11.656 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:51:11 compute-0 nova_compute[189265]: 2025-09-30 07:51:11.658 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:51:11 compute-0 nova_compute[189265]: 2025-09-30 07:51:11.679 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:51:11 compute-0 nova_compute[189265]: 2025-09-30 07:51:11.680 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5684MB free_disk=73.30248641967773GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:51:11 compute-0 nova_compute[189265]: 2025-09-30 07:51:11.680 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:51:11 compute-0 nova_compute[189265]: 2025-09-30 07:51:11.681 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:51:12 compute-0 nova_compute[189265]: 2025-09-30 07:51:12.564 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:12 compute-0 nova_compute[189265]: 2025-09-30 07:51:12.737 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Instance 81f3990d-a6c1-43db-aa50-6b771758b8fb actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1740
Sep 30 07:51:12 compute-0 nova_compute[189265]: 2025-09-30 07:51:12.737 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:51:12 compute-0 nova_compute[189265]: 2025-09-30 07:51:12.737 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:51:11 up  1:48,  0 user,  load average: 0.18, 0.20, 0.25\n', 'num_instances': '1', 'num_vm_active': '1', 'num_task_None': '1', 'num_os_type_None': '1', 'num_proj_2d470809703a44e69c2bc0d283b2bce4': '1', 'io_workload': '0'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:51:12 compute-0 nova_compute[189265]: 2025-09-30 07:51:12.771 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:51:13 compute-0 nova_compute[189265]: 2025-09-30 07:51:13.278 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:51:13 compute-0 ovn_controller[91436]: 2025-09-30T07:51:13Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:78:e7:1d 10.100.0.14
Sep 30 07:51:13 compute-0 ovn_controller[91436]: 2025-09-30T07:51:13Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:78:e7:1d 10.100.0.14
Sep 30 07:51:13 compute-0 nova_compute[189265]: 2025-09-30 07:51:13.790 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:51:13 compute-0 nova_compute[189265]: 2025-09-30 07:51:13.791 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.110s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:51:14 compute-0 podman[226920]: 2025-09-30 07:51:14.500263995 +0000 UTC m=+0.080172932 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 07:51:14 compute-0 nova_compute[189265]: 2025-09-30 07:51:14.791 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:51:15 compute-0 nova_compute[189265]: 2025-09-30 07:51:15.265 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:17 compute-0 nova_compute[189265]: 2025-09-30 07:51:17.604 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:18 compute-0 nova_compute[189265]: 2025-09-30 07:51:18.847 2 DEBUG nova.virt.libvirt.driver [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Creating tmpfile /var/lib/nova/instances/tmpffdz8q9d to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:10944
Sep 30 07:51:18 compute-0 nova_compute[189265]: 2025-09-30 07:51:18.848 2 WARNING neutronclient.v2_0.client [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:18 compute-0 nova_compute[189265]: 2025-09-30 07:51:18.853 2 DEBUG nova.compute.manager [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpffdz8q9d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst=<?>,serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.12/site-packages/nova/compute/manager.py:9086
Sep 30 07:51:19 compute-0 podman[226941]: 2025-09-30 07:51:19.517453296 +0000 UTC m=+0.086637619 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64)
Sep 30 07:51:20 compute-0 nova_compute[189265]: 2025-09-30 07:51:20.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:20.600 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:51:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:20.601 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:51:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:20.602 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:51:20 compute-0 nova_compute[189265]: 2025-09-30 07:51:20.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:51:20 compute-0 nova_compute[189265]: 2025-09-30 07:51:20.891 2 WARNING neutronclient.v2_0.client [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:21 compute-0 unix_chkpwd[226965]: password check failed for user (root)
Sep 30 07:51:21 compute-0 sshd-session[226963]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 07:51:22 compute-0 podman[226967]: 2025-09-30 07:51:22.53482445 +0000 UTC m=+0.100878271 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 07:51:22 compute-0 podman[226966]: 2025-09-30 07:51:22.544123918 +0000 UTC m=+0.114842983 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 07:51:22 compute-0 podman[226968]: 2025-09-30 07:51:22.570992833 +0000 UTC m=+0.132587305 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.license=GPLv2)
Sep 30 07:51:22 compute-0 nova_compute[189265]: 2025-09-30 07:51:22.606 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:23 compute-0 sshd-session[226963]: Failed password for root from 80.94.93.119 port 45352 ssh2
Sep 30 07:51:23 compute-0 unix_chkpwd[227030]: password check failed for user (root)
Sep 30 07:51:25 compute-0 nova_compute[189265]: 2025-09-30 07:51:25.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:25 compute-0 nova_compute[189265]: 2025-09-30 07:51:25.512 2 DEBUG nova.compute.manager [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpffdz8q9d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='58023b4a-dc78-4f0b-a216-57b512b9561c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9311
Sep 30 07:51:26 compute-0 sshd-session[226963]: Failed password for root from 80.94.93.119 port 45352 ssh2
Sep 30 07:51:26 compute-0 nova_compute[189265]: 2025-09-30 07:51:26.527 2 DEBUG oslo_concurrency.lockutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-58023b4a-dc78-4f0b-a216-57b512b9561c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:51:26 compute-0 nova_compute[189265]: 2025-09-30 07:51:26.527 2 DEBUG oslo_concurrency.lockutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-58023b4a-dc78-4f0b-a216-57b512b9561c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:51:26 compute-0 nova_compute[189265]: 2025-09-30 07:51:26.528 2 DEBUG nova.network.neutron [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:51:27 compute-0 nova_compute[189265]: 2025-09-30 07:51:27.035 2 WARNING neutronclient.v2_0.client [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:27 compute-0 nova_compute[189265]: 2025-09-30 07:51:27.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:27 compute-0 unix_chkpwd[227031]: password check failed for user (root)
Sep 30 07:51:28 compute-0 nova_compute[189265]: 2025-09-30 07:51:28.525 2 WARNING neutronclient.v2_0.client [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:28 compute-0 nova_compute[189265]: 2025-09-30 07:51:28.773 2 DEBUG nova.network.neutron [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Updating instance_info_cache with network_info: [{"id": "e73e78a1-01c0-43c0-b191-97cbd6fd06aa", "address": "fa:16:3e:46:c1:55", "network": {"id": "fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1009939787-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3bc30bc6516c4e49aed5726171c74d6f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape73e78a1-01", "ovs_interfaceid": "e73e78a1-01c0-43c0-b191-97cbd6fd06aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.280 2 DEBUG oslo_concurrency.lockutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-58023b4a-dc78-4f0b-a216-57b512b9561c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.300 2 DEBUG nova.virt.libvirt.driver [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpffdz8q9d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='58023b4a-dc78-4f0b-a216-57b512b9561c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=<?>,source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11737
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.301 2 DEBUG nova.virt.libvirt.driver [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Creating instance directory: /var/lib/nova/instances/58023b4a-dc78-4f0b-a216-57b512b9561c pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11750
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.301 2 DEBUG nova.virt.libvirt.driver [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Creating disk.info with the contents: {'/var/lib/nova/instances/58023b4a-dc78-4f0b-a216-57b512b9561c/disk': 'qcow2', '/var/lib/nova/instances/58023b4a-dc78-4f0b-a216-57b512b9561c/disk.config': 'raw'} pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11764
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.302 2 DEBUG nova.virt.libvirt.driver [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11774
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.303 2 DEBUG nova.objects.instance [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'trusted_certs' on Instance uuid 58023b4a-dc78-4f0b-a216-57b512b9561c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:51:29 compute-0 podman[199733]: time="2025-09-30T07:51:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:51:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:51:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 20752 "" "Go-http-client/1.1"
Sep 30 07:51:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:51:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3478 "" "Go-http-client/1.1"
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.810 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.814 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.816 2 DEBUG oslo_concurrency.processutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.900 2 DEBUG oslo_concurrency.processutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.900 2 DEBUG oslo_concurrency.lockutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "649c128805005f3dfb5a93843c58a367cdfe939d" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.901 2 DEBUG oslo_concurrency.lockutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.902 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vmdk does not match, excluding from consideration (Signature KDMV not found: b'\xebH\x90\x00') _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.906 2 DEBUG oslo_utils.imageutils.format_inspector [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Format inspector for vhdx does not match, excluding from consideration (Region signature not found at 30000) _process_chunk /usr/lib/python3.12/site-packages/oslo_utils/imageutils/format_inspector.py:1365
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.907 2 DEBUG oslo_concurrency.processutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:51:29 compute-0 sshd-session[226963]: Failed password for root from 80.94.93.119 port 45352 ssh2
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.992 2 DEBUG oslo_concurrency.processutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:51:29 compute-0 nova_compute[189265]: 2025-09-30 07:51:29.994 2 DEBUG oslo_concurrency.processutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/58023b4a-dc78-4f0b-a216-57b512b9561c/disk 1073741824 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.028 2 DEBUG oslo_concurrency.processutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d,backing_fmt=raw /var/lib/nova/instances/58023b4a-dc78-4f0b-a216-57b512b9561c/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.030 2 DEBUG oslo_concurrency.lockutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "649c128805005f3dfb5a93843c58a367cdfe939d" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.129s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.030 2 DEBUG oslo_concurrency.processutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:51:30 compute-0 sshd-session[226963]: Received disconnect from 80.94.93.119 port 45352:11:  [preauth]
Sep 30 07:51:30 compute-0 sshd-session[226963]: Disconnected from authenticating user root 80.94.93.119 port 45352 [preauth]
Sep 30 07:51:30 compute-0 sshd-session[226963]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.095 2 DEBUG oslo_concurrency.processutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/649c128805005f3dfb5a93843c58a367cdfe939d --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.096 2 DEBUG nova.virt.disk.api [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Checking if we can resize image /var/lib/nova/instances/58023b4a-dc78-4f0b-a216-57b512b9561c/disk. size=1073741824 can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:164
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.096 2 DEBUG oslo_concurrency.processutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/58023b4a-dc78-4f0b-a216-57b512b9561c/disk --force-share --output=json execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.150 2 DEBUG oslo_concurrency.processutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/58023b4a-dc78-4f0b-a216-57b512b9561c/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.151 2 DEBUG nova.virt.disk.api [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Cannot resize image /var/lib/nova/instances/58023b4a-dc78-4f0b-a216-57b512b9561c/disk to a smaller size. can_resize_image /usr/lib/python3.12/site-packages/nova/virt/disk/api.py:170
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.151 2 DEBUG nova.objects.instance [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lazy-loading 'migration_context' on Instance uuid 58023b4a-dc78-4f0b-a216-57b512b9561c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:30 compute-0 ovn_controller[91436]: 2025-09-30T07:51:30Z|00298|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.660 2 DEBUG nova.objects.base [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Object Instance<58023b4a-dc78-4f0b-a216-57b512b9561c> lazy-loaded attributes: trusted_certs,migration_context wrapper /usr/lib/python3.12/site-packages/nova/objects/base.py:136
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.660 2 DEBUG oslo_concurrency.processutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/58023b4a-dc78-4f0b-a216-57b512b9561c/disk.config 497664 execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.686 2 DEBUG oslo_concurrency.processutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/58023b4a-dc78-4f0b-a216-57b512b9561c/disk.config 497664" returned: 0 in 0.026s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.687 2 DEBUG nova.virt.libvirt.driver [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11704
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.689 2 DEBUG nova.virt.libvirt.vif [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=1,config_drive='True',created_at=2025-09-30T07:50:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-981327066',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-981327066',id=34,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:50:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='2d470809703a44e69c2bc0d283b2bce4',ramdisk_id='',reservation_id='r-jrwm4zqc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='vir
tio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-642893642',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-642893642-project-admin'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-09-30T07:50:38Z,user_data=None,user_id='f267aaead8a3437a8359b21224982b1c',uuid=58023b4a-dc78-4f0b-a216-57b512b9561c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e73e78a1-01c0-43c0-b191-97cbd6fd06aa", "address": "fa:16:3e:46:c1:55", "network": {"id": "fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1009939787-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3bc30bc6516c4e49aed5726171c74d6f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape73e78a1-01", "ovs_interfaceid": "e73e78a1-01c0-43c0-b191-97cbd6fd06aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:721
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.690 2 DEBUG nova.network.os_vif_util [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converting VIF {"id": "e73e78a1-01c0-43c0-b191-97cbd6fd06aa", "address": "fa:16:3e:46:c1:55", "network": {"id": "fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1009939787-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3bc30bc6516c4e49aed5726171c74d6f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tape73e78a1-01", "ovs_interfaceid": "e73e78a1-01c0-43c0-b191-97cbd6fd06aa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.691 2 DEBUG nova.network.os_vif_util [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:c1:55,bridge_name='br-int',has_traffic_filtering=True,id=e73e78a1-01c0-43c0-b191-97cbd6fd06aa,network=Network(fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape73e78a1-01') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.692 2 DEBUG os_vif [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:c1:55,bridge_name='br-int',has_traffic_filtering=True,id=e73e78a1-01c0-43c0-b191-97cbd6fd06aa,network=Network(fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape73e78a1-01') plug /usr/lib/python3.12/site-packages/os_vif/__init__.py:76
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.693 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.693 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.694 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.694 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.695 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbCreateCommand(_result=None, table=QoS, columns={'type': 'linux-noop', 'external_ids': {'id': '801f9ad2-86ab-5d21-b938-ad4be67083a1', '_type': 'linux-noop'}}, row=False) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.701 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape73e78a1-01, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.701 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Port, record=tape73e78a1-01, col_values=(('qos', UUID('c736cfee-ea68-4925-9852-e0a3d3475417')),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.701 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=2): DbSetCommand(_result=None, table=Interface, record=tape73e78a1-01, col_values=(('external_ids', {'iface-id': 'e73e78a1-01c0-43c0-b191-97cbd6fd06aa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:c1:55', 'vm-uuid': '58023b4a-dc78-4f0b-a216-57b512b9561c'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.702 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:30 compute-0 NetworkManager[51813]: <info>  [1759218690.7040] manager: (tape73e78a1-01): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.704 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:248
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.712 2 INFO os_vif [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:c1:55,bridge_name='br-int',has_traffic_filtering=True,id=e73e78a1-01c0-43c0-b191-97cbd6fd06aa,network=Network(fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape73e78a1-01')
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.713 2 DEBUG nova.virt.libvirt.driver [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.12/site-packages/nova/virt/libvirt/driver.py:11851
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.713 2 DEBUG nova.compute.manager [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpffdz8q9d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='58023b4a-dc78-4f0b-a216-57b512b9561c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9377
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.714 2 WARNING neutronclient.v2_0.client [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:30 compute-0 nova_compute[189265]: 2025-09-30 07:51:30.814 2 WARNING neutronclient.v2_0.client [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:30 compute-0 unix_chkpwd[227055]: password check failed for user (root)
Sep 30 07:51:30 compute-0 sshd-session[227047]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 07:51:31 compute-0 openstack_network_exporter[201859]: ERROR   07:51:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:51:31 compute-0 openstack_network_exporter[201859]: ERROR   07:51:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:51:31 compute-0 openstack_network_exporter[201859]: ERROR   07:51:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:51:31 compute-0 openstack_network_exporter[201859]: ERROR   07:51:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:51:31 compute-0 openstack_network_exporter[201859]: ERROR   07:51:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:51:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:31.571 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:51:31 compute-0 nova_compute[189265]: 2025-09-30 07:51:31.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:31 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:31.572 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:51:32 compute-0 nova_compute[189265]: 2025-09-30 07:51:32.039 2 DEBUG nova.network.neutron [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Port e73e78a1-01c0-43c0-b191-97cbd6fd06aa updated with migration profile {'migrating_to': 'compute-0.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.12/site-packages/nova/network/neutron.py:356
Sep 30 07:51:32 compute-0 nova_compute[189265]: 2025-09-30 07:51:32.054 2 DEBUG nova.compute.manager [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_cpu_shared_set_info=set([]),dst_numa_info=<?>,dst_supports_mdev_live_migration=True,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpffdz8q9d',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='58023b4a-dc78-4f0b-a216-57b512b9561c',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},pci_dev_map_src_dst={},serial_listen_addr=None,serial_listen_ports=[],source_mdev_types=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,target_mdevs=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.12/site-packages/nova/compute/manager.py:9443
Sep 30 07:51:33 compute-0 sshd-session[227047]: Failed password for root from 80.94.93.119 port 34656 ssh2
Sep 30 07:51:34 compute-0 sshd-session[227057]: Invalid user hadoop from 159.89.22.242 port 51912
Sep 30 07:51:34 compute-0 sshd-session[227057]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:51:34 compute-0 sshd-session[227057]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=159.89.22.242
Sep 30 07:51:34 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:34.573 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:34 compute-0 systemd[1]: Starting libvirt proxy daemon...
Sep 30 07:51:34 compute-0 systemd[1]: Started libvirt proxy daemon.
Sep 30 07:51:34 compute-0 podman[227059]: 2025-09-30 07:51:34.884853218 +0000 UTC m=+0.101027205 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:51:35 compute-0 kernel: tape73e78a1-01: entered promiscuous mode
Sep 30 07:51:35 compute-0 NetworkManager[51813]: <info>  [1759218695.0008] manager: (tape73e78a1-01): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Sep 30 07:51:35 compute-0 ovn_controller[91436]: 2025-09-30T07:51:35Z|00299|binding|INFO|Claiming lport e73e78a1-01c0-43c0-b191-97cbd6fd06aa for this additional chassis.
Sep 30 07:51:35 compute-0 nova_compute[189265]: 2025-09-30 07:51:35.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:35 compute-0 ovn_controller[91436]: 2025-09-30T07:51:35Z|00300|binding|INFO|e73e78a1-01c0-43c0-b191-97cbd6fd06aa: Claiming fa:16:3e:46:c1:55 10.100.0.10
Sep 30 07:51:35 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:35.055 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:c1:55 10.100.0.10'], port_security=['fa:16:3e:46:c1:55 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-1.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp'}, parent_port=[], requested_additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '58023b4a-dc78-4f0b-a216-57b512b9561c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2d470809703a44e69c2bc0d283b2bce4', 'neutron:revision_number': '10', 'neutron:security_group_ids': '33f74b46-9f43-491b-8b86-3db30861b2d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a9ce9f20-a5bb-4f10-94ac-caac3fb7e1ad, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=e73e78a1-01c0-43c0-b191-97cbd6fd06aa) old=Port_Binding(additional_chassis=[]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:51:35 compute-0 ovn_controller[91436]: 2025-09-30T07:51:35Z|00301|binding|INFO|Setting lport e73e78a1-01c0-43c0-b191-97cbd6fd06aa ovn-installed in OVS
Sep 30 07:51:35 compute-0 nova_compute[189265]: 2025-09-30 07:51:35.056 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:35 compute-0 nova_compute[189265]: 2025-09-30 07:51:35.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:35 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:35.058 100322 INFO neutron.agent.ovn.metadata.agent [-] Port e73e78a1-01c0-43c0-b191-97cbd6fd06aa in datapath fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f unbound from our chassis
Sep 30 07:51:35 compute-0 nova_compute[189265]: 2025-09-30 07:51:35.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:35 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:35.061 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f
Sep 30 07:51:35 compute-0 systemd-udevd[227114]: Network interface NamePolicy= disabled on kernel command line.
Sep 30 07:51:35 compute-0 systemd-machined[149233]: New machine qemu-26-instance-00000022.
Sep 30 07:51:35 compute-0 unix_chkpwd[227117]: password check failed for user (root)
Sep 30 07:51:35 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:35.087 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[bf7602f5-63ed-4488-8142-20dc03e4d5ed]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:35 compute-0 NetworkManager[51813]: <info>  [1759218695.0959] device (tape73e78a1-01): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Sep 30 07:51:35 compute-0 systemd[1]: Started Virtual Machine qemu-26-instance-00000022.
Sep 30 07:51:35 compute-0 NetworkManager[51813]: <info>  [1759218695.0987] device (tape73e78a1-01): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Sep 30 07:51:35 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:35.134 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[6d1611b8-3061-44cd-a221-643187e983ab]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:35 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:35.137 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[2e85f85a-6a1f-4464-90a6-2a03b26e9358]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:35 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:35.175 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[c1cb6542-e842-4a67-9cb6-468b22b5576b]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:35 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:35.200 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[3e853b25-993d-4c54-9b95-e321d0391d8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe7b13d3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:21:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 5, 'rx_bytes': 916, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651796, 'reachable_time': 32071, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227128, 'error': None, 'target': 'ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:35 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:35.219 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[79f9602a-3623-4067-b244-3326a59af8f6]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfe7b13d3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 651814, 'tstamp': 651814}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227129, 'error': None, 'target': 'ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfe7b13d3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 651819, 'tstamp': 651819}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227129, 'error': None, 'target': 'ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:35 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:35.221 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe7b13d3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:35 compute-0 nova_compute[189265]: 2025-09-30 07:51:35.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:35 compute-0 nova_compute[189265]: 2025-09-30 07:51:35.224 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:35 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:35.224 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe7b13d3-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:35 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:35.224 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:51:35 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:35.225 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe7b13d3-f0, col_values=(('external_ids', {'iface-id': 'dcaa81b8-98f0-4a50-8978-177847c9169e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:35 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:35.225 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:51:35 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:35.227 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[3eb4ee2d-3f55-4fea-aabe-41948322013a]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:35 compute-0 nova_compute[189265]: 2025-09-30 07:51:35.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:35 compute-0 nova_compute[189265]: 2025-09-30 07:51:35.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:36 compute-0 sshd-session[227057]: Failed password for invalid user hadoop from 159.89.22.242 port 51912 ssh2
Sep 30 07:51:37 compute-0 sshd-session[227047]: Failed password for root from 80.94.93.119 port 34656 ssh2
Sep 30 07:51:38 compute-0 sshd-session[227057]: Received disconnect from 159.89.22.242 port 51912:11: Bye Bye [preauth]
Sep 30 07:51:38 compute-0 sshd-session[227057]: Disconnected from invalid user hadoop 159.89.22.242 port 51912 [preauth]
Sep 30 07:51:38 compute-0 ovn_controller[91436]: 2025-09-30T07:51:38Z|00302|binding|INFO|Claiming lport e73e78a1-01c0-43c0-b191-97cbd6fd06aa for this chassis.
Sep 30 07:51:38 compute-0 ovn_controller[91436]: 2025-09-30T07:51:38Z|00303|binding|INFO|e73e78a1-01c0-43c0-b191-97cbd6fd06aa: Claiming fa:16:3e:46:c1:55 10.100.0.10
Sep 30 07:51:38 compute-0 ovn_controller[91436]: 2025-09-30T07:51:38Z|00304|binding|INFO|Setting lport e73e78a1-01c0-43c0-b191-97cbd6fd06aa up in Southbound
Sep 30 07:51:39 compute-0 unix_chkpwd[227151]: password check failed for user (root)
Sep 30 07:51:39 compute-0 nova_compute[189265]: 2025-09-30 07:51:39.913 2 INFO nova.compute.manager [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Post operation of migration started
Sep 30 07:51:39 compute-0 nova_compute[189265]: 2025-09-30 07:51:39.914 2 WARNING neutronclient.v2_0.client [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:40 compute-0 nova_compute[189265]: 2025-09-30 07:51:40.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:40 compute-0 nova_compute[189265]: 2025-09-30 07:51:40.336 2 WARNING neutronclient.v2_0.client [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:40 compute-0 nova_compute[189265]: 2025-09-30 07:51:40.337 2 WARNING neutronclient.v2_0.client [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:40 compute-0 nova_compute[189265]: 2025-09-30 07:51:40.447 2 DEBUG oslo_concurrency.lockutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "refresh_cache-58023b4a-dc78-4f0b-a216-57b512b9561c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:313
Sep 30 07:51:40 compute-0 nova_compute[189265]: 2025-09-30 07:51:40.448 2 DEBUG oslo_concurrency.lockutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquired lock "refresh_cache-58023b4a-dc78-4f0b-a216-57b512b9561c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:316
Sep 30 07:51:40 compute-0 nova_compute[189265]: 2025-09-30 07:51:40.448 2 DEBUG nova.network.neutron [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:2070
Sep 30 07:51:40 compute-0 nova_compute[189265]: 2025-09-30 07:51:40.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:40 compute-0 sshd-session[227047]: Failed password for root from 80.94.93.119 port 34656 ssh2
Sep 30 07:51:40 compute-0 nova_compute[189265]: 2025-09-30 07:51:40.955 2 WARNING neutronclient.v2_0.client [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:41 compute-0 sshd-session[227054]: error: kex_exchange_identification: read: Connection timed out
Sep 30 07:51:41 compute-0 sshd-session[227054]: banner exchange: Connection from 14.103.242.177 port 34342: Connection timed out
Sep 30 07:51:41 compute-0 sshd-session[227047]: Received disconnect from 80.94.93.119 port 34656:11:  [preauth]
Sep 30 07:51:41 compute-0 sshd-session[227047]: Disconnected from authenticating user root 80.94.93.119 port 34656 [preauth]
Sep 30 07:51:41 compute-0 sshd-session[227047]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 07:51:41 compute-0 nova_compute[189265]: 2025-09-30 07:51:41.881 2 WARNING neutronclient.v2_0.client [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:42 compute-0 unix_chkpwd[227154]: password check failed for user (root)
Sep 30 07:51:42 compute-0 sshd-session[227152]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 07:51:42 compute-0 nova_compute[189265]: 2025-09-30 07:51:42.561 2 DEBUG nova.network.neutron [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Updating instance_info_cache with network_info: [{"id": "e73e78a1-01c0-43c0-b191-97cbd6fd06aa", "address": "fa:16:3e:46:c1:55", "network": {"id": "fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1009939787-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3bc30bc6516c4e49aed5726171c74d6f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape73e78a1-01", "ovs_interfaceid": "e73e78a1-01c0-43c0-b191-97cbd6fd06aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:51:43 compute-0 nova_compute[189265]: 2025-09-30 07:51:43.068 2 DEBUG oslo_concurrency.lockutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Releasing lock "refresh_cache-58023b4a-dc78-4f0b-a216-57b512b9561c" lock /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:334
Sep 30 07:51:43 compute-0 nova_compute[189265]: 2025-09-30 07:51:43.589 2 DEBUG oslo_concurrency.lockutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:51:43 compute-0 nova_compute[189265]: 2025-09-30 07:51:43.590 2 DEBUG oslo_concurrency.lockutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:51:43 compute-0 nova_compute[189265]: 2025-09-30 07:51:43.590 2 DEBUG oslo_concurrency.lockutils [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:51:43 compute-0 nova_compute[189265]: 2025-09-30 07:51:43.596 2 INFO nova.virt.libvirt.driver [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Sep 30 07:51:43 compute-0 virtqemud[189090]: Domain id=26 name='instance-00000022' uuid=58023b4a-dc78-4f0b-a216-57b512b9561c is tainted: custom-monitor
Sep 30 07:51:44 compute-0 sshd-session[227152]: Failed password for root from 80.94.93.119 port 38044 ssh2
Sep 30 07:51:44 compute-0 nova_compute[189265]: 2025-09-30 07:51:44.603 2 INFO nova.virt.libvirt.driver [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Sep 30 07:51:45 compute-0 nova_compute[189265]: 2025-09-30 07:51:45.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:45 compute-0 podman[227155]: 2025-09-30 07:51:45.5190972 +0000 UTC m=+0.088955887 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 07:51:45 compute-0 nova_compute[189265]: 2025-09-30 07:51:45.610 2 INFO nova.virt.libvirt.driver [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Sep 30 07:51:45 compute-0 nova_compute[189265]: 2025-09-30 07:51:45.616 2 DEBUG nova.compute.manager [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Checking state _get_power_state /usr/lib/python3.12/site-packages/nova/compute/manager.py:1825
Sep 30 07:51:45 compute-0 nova_compute[189265]: 2025-09-30 07:51:45.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:46 compute-0 nova_compute[189265]: 2025-09-30 07:51:46.127 2 DEBUG nova.objects.instance [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.12/site-packages/nova/objects/instance.py:1067
Sep 30 07:51:46 compute-0 unix_chkpwd[227176]: password check failed for user (root)
Sep 30 07:51:47 compute-0 nova_compute[189265]: 2025-09-30 07:51:47.150 2 WARNING neutronclient.v2_0.client [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:47 compute-0 nova_compute[189265]: 2025-09-30 07:51:47.540 2 WARNING neutronclient.v2_0.client [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:47 compute-0 nova_compute[189265]: 2025-09-30 07:51:47.540 2 WARNING neutronclient.v2_0.client [None req-a61732a9-f7a9-4b24-a49e-d6643f62f075 e3d6072cd5b7424780fdb085c0bf3ad6 c907a5861ae4441bb28195d4a66713bf - - default default] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:48 compute-0 sshd-session[227152]: Failed password for root from 80.94.93.119 port 38044 ssh2
Sep 30 07:51:50 compute-0 nova_compute[189265]: 2025-09-30 07:51:50.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:50 compute-0 podman[227177]: 2025-09-30 07:51:50.526153387 +0000 UTC m=+0.100876850 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 07:51:50 compute-0 unix_chkpwd[227198]: password check failed for user (root)
Sep 30 07:51:50 compute-0 nova_compute[189265]: 2025-09-30 07:51:50.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:50 compute-0 nova_compute[189265]: 2025-09-30 07:51:50.857 2 DEBUG oslo_concurrency.lockutils [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Acquiring lock "81f3990d-a6c1-43db-aa50-6b771758b8fb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:51:50 compute-0 nova_compute[189265]: 2025-09-30 07:51:50.857 2 DEBUG oslo_concurrency.lockutils [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "81f3990d-a6c1-43db-aa50-6b771758b8fb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:51:50 compute-0 nova_compute[189265]: 2025-09-30 07:51:50.858 2 DEBUG oslo_concurrency.lockutils [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Acquiring lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:51:50 compute-0 nova_compute[189265]: 2025-09-30 07:51:50.858 2 DEBUG oslo_concurrency.lockutils [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:51:50 compute-0 nova_compute[189265]: 2025-09-30 07:51:50.859 2 DEBUG oslo_concurrency.lockutils [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:51:50 compute-0 nova_compute[189265]: 2025-09-30 07:51:50.872 2 INFO nova.compute.manager [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Terminating instance
Sep 30 07:51:51 compute-0 nova_compute[189265]: 2025-09-30 07:51:51.391 2 DEBUG nova.compute.manager [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:51:51 compute-0 kernel: tap76fa05ca-c2 (unregistering): left promiscuous mode
Sep 30 07:51:51 compute-0 NetworkManager[51813]: <info>  [1759218711.4212] device (tap76fa05ca-c2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:51:51 compute-0 nova_compute[189265]: 2025-09-30 07:51:51.468 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:51 compute-0 ovn_controller[91436]: 2025-09-30T07:51:51Z|00305|binding|INFO|Releasing lport 76fa05ca-c22d-48b8-b82e-ed72d1971c4c from this chassis (sb_readonly=0)
Sep 30 07:51:51 compute-0 ovn_controller[91436]: 2025-09-30T07:51:51Z|00306|binding|INFO|Setting lport 76fa05ca-c22d-48b8-b82e-ed72d1971c4c down in Southbound
Sep 30 07:51:51 compute-0 ovn_controller[91436]: 2025-09-30T07:51:51Z|00307|binding|INFO|Removing iface tap76fa05ca-c2 ovn-installed in OVS
Sep 30 07:51:51 compute-0 nova_compute[189265]: 2025-09-30 07:51:51.470 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:51 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:51.495 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:78:e7:1d 10.100.0.14'], port_security=['fa:16:3e:78:e7:1d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '81f3990d-a6c1-43db-aa50-6b771758b8fb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2d470809703a44e69c2bc0d283b2bce4', 'neutron:revision_number': '5', 'neutron:security_group_ids': '33f74b46-9f43-491b-8b86-3db30861b2d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a9ce9f20-a5bb-4f10-94ac-caac3fb7e1ad, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=76fa05ca-c22d-48b8-b82e-ed72d1971c4c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:51:51 compute-0 nova_compute[189265]: 2025-09-30 07:51:51.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:51 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:51.496 100322 INFO neutron.agent.ovn.metadata.agent [-] Port 76fa05ca-c22d-48b8-b82e-ed72d1971c4c in datapath fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f unbound from our chassis
Sep 30 07:51:51 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:51.500 100322 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f
Sep 30 07:51:51 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:51.521 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[acb83c60-4e63-4777-878d-9537d1458c20]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:51 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000023.scope: Deactivated successfully.
Sep 30 07:51:51 compute-0 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000023.scope: Consumed 14.043s CPU time.
Sep 30 07:51:51 compute-0 systemd-machined[149233]: Machine qemu-25-instance-00000023 terminated.
Sep 30 07:51:51 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:51.569 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[ef6ed754-877d-43bf-979d-38983ad13a71]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:51 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:51.572 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[1493ac63-da59-4849-b532-939572da5216]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:51 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:51.613 212743 DEBUG oslo.privsep.daemon [-] privsep: reply[199ff47d-7e53-487c-86bc-eea34564ad50]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:51 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:51.641 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[08386781-0b7d-4046-92d4-6ef785dc8ddd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe7b13d3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bf:21:3a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 30, 'tx_packets': 7, 'rx_bytes': 1756, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 75], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651796, 'reachable_time': 32071, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227215, 'error': None, 'target': 'ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:51 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:51.672 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[69336f02-2d8c-42c5-87d5-2c356376d996]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfe7b13d3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 651814, 'tstamp': 651814}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227226, 'error': None, 'target': 'ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfe7b13d3-f1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 651819, 'tstamp': 651819}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227226, 'error': None, 'target': 'ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:51 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:51.674 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe7b13d3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:51 compute-0 nova_compute[189265]: 2025-09-30 07:51:51.675 2 INFO nova.virt.libvirt.driver [-] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Instance destroyed successfully.
Sep 30 07:51:51 compute-0 nova_compute[189265]: 2025-09-30 07:51:51.675 2 DEBUG nova.objects.instance [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lazy-loading 'resources' on Instance uuid 81f3990d-a6c1-43db-aa50-6b771758b8fb obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:51:51 compute-0 nova_compute[189265]: 2025-09-30 07:51:51.676 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:51 compute-0 nova_compute[189265]: 2025-09-30 07:51:51.684 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:51 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:51.684 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe7b13d3-f0, may_exist=True, interface_attrs={}) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:51 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:51.685 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:51:51 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:51.685 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe7b13d3-f0, col_values=(('external_ids', {'iface-id': 'dcaa81b8-98f0-4a50-8978-177847c9169e'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:51 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:51.686 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Sep 30 07:51:51 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:51.687 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[dde85022-4188-4cb2-8902-f6d7cf6b666e]: (4, '\nglobal\n    log         /dev/log local0 debug\n    log-tag     haproxy-metadata-proxy-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f\n    user        root\n    group       root\n    maxconn     1024\n    pidfile     /var/lib/neutron/external/pids/fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f.pid.haproxy\n    daemon\n\ndefaults\n    log global\n    mode http\n    option httplog\n    option dontlognull\n    option http-server-close\n    option forwardfor\n    retries                 3\n    timeout http-request    30s\n    timeout connect         30s\n    timeout client          32s\n    timeout server          32s\n    timeout http-keep-alive 30s\n\nlisten listener\n    bind 169.254.169.254:80\n    \n    server metadata /var/lib/neutron/metadata_proxy\n\n    http-request add-header X-OVN-Network-ID fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f\n') _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:51 compute-0 nova_compute[189265]: 2025-09-30 07:51:51.711 2 DEBUG nova.compute.manager [req-b10dc54d-a191-4af6-bd2b-934d79289f04 req-c67c8732-48df-43c1-b250-396b36c4d279 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Received event network-vif-unplugged-76fa05ca-c22d-48b8-b82e-ed72d1971c4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:51:51 compute-0 nova_compute[189265]: 2025-09-30 07:51:51.711 2 DEBUG oslo_concurrency.lockutils [req-b10dc54d-a191-4af6-bd2b-934d79289f04 req-c67c8732-48df-43c1-b250-396b36c4d279 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:51:51 compute-0 nova_compute[189265]: 2025-09-30 07:51:51.711 2 DEBUG oslo_concurrency.lockutils [req-b10dc54d-a191-4af6-bd2b-934d79289f04 req-c67c8732-48df-43c1-b250-396b36c4d279 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:51:51 compute-0 nova_compute[189265]: 2025-09-30 07:51:51.712 2 DEBUG oslo_concurrency.lockutils [req-b10dc54d-a191-4af6-bd2b-934d79289f04 req-c67c8732-48df-43c1-b250-396b36c4d279 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:51:51 compute-0 nova_compute[189265]: 2025-09-30 07:51:51.712 2 DEBUG nova.compute.manager [req-b10dc54d-a191-4af6-bd2b-934d79289f04 req-c67c8732-48df-43c1-b250-396b36c4d279 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] No waiting events found dispatching network-vif-unplugged-76fa05ca-c22d-48b8-b82e-ed72d1971c4c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:51:51 compute-0 nova_compute[189265]: 2025-09-30 07:51:51.712 2 DEBUG nova.compute.manager [req-b10dc54d-a191-4af6-bd2b-934d79289f04 req-c67c8732-48df-43c1-b250-396b36c4d279 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Received event network-vif-unplugged-76fa05ca-c22d-48b8-b82e-ed72d1971c4c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.183 2 DEBUG nova.virt.libvirt.vif [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,compute_id=2,config_drive='True',created_at=2025-09-30T07:50:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-935657189',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-935657189',id=35,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:51:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2d470809703a44e69c2bc0d283b2bce4',ramdisk_id='',reservation_id='r-jwcbn16t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-642893642',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-642893642-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:51:02Z,user_data=None,user_id='f267aaead8a3437a8359b21224982b1c',uuid=81f3990d-a6c1-43db-aa50-6b771758b8fb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "address": "fa:16:3e:78:e7:1d", "network": {"id": "fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1009939787-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3bc30bc6516c4e49aed5726171c74d6f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76fa05ca-c2", "ovs_interfaceid": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.184 2 DEBUG nova.network.os_vif_util [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Converting VIF {"id": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "address": "fa:16:3e:78:e7:1d", "network": {"id": "fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1009939787-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3bc30bc6516c4e49aed5726171c74d6f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap76fa05ca-c2", "ovs_interfaceid": "76fa05ca-c22d-48b8-b82e-ed72d1971c4c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.185 2 DEBUG nova.network.os_vif_util [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:78:e7:1d,bridge_name='br-int',has_traffic_filtering=True,id=76fa05ca-c22d-48b8-b82e-ed72d1971c4c,network=Network(fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76fa05ca-c2') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.186 2 DEBUG os_vif [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:e7:1d,bridge_name='br-int',has_traffic_filtering=True,id=76fa05ca-c22d-48b8-b82e-ed72d1971c4c,network=Network(fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76fa05ca-c2') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.189 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap76fa05ca-c2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.194 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=697b96ce-ba1d-4a0e-87b7-482421308bcd) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.198 2 INFO os_vif [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:78:e7:1d,bridge_name='br-int',has_traffic_filtering=True,id=76fa05ca-c22d-48b8-b82e-ed72d1971c4c,network=Network(fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap76fa05ca-c2')
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.199 2 INFO nova.virt.libvirt.driver [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Deleting instance files /var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb_del
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.200 2 INFO nova.virt.libvirt.driver [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Deletion of /var/lib/nova/instances/81f3990d-a6c1-43db-aa50-6b771758b8fb_del complete
Sep 30 07:51:52 compute-0 sshd-session[227152]: Failed password for root from 80.94.93.119 port 38044 ssh2
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.719 2 INFO nova.compute.manager [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Took 1.33 seconds to destroy the instance on the hypervisor.
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.719 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.720 2 DEBUG nova.compute.manager [-] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.720 2 DEBUG nova.network.neutron [-] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.720 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:52 compute-0 nova_compute[189265]: 2025-09-30 07:51:52.867 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:53 compute-0 nova_compute[189265]: 2025-09-30 07:51:53.215 2 DEBUG nova.compute.manager [req-e2f7ac01-f9ca-4fad-be3d-be9e82a06b28 req-8d660ffe-4155-4cf3-867f-1ea89e78b3b5 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Received event network-vif-deleted-76fa05ca-c22d-48b8-b82e-ed72d1971c4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:51:53 compute-0 nova_compute[189265]: 2025-09-30 07:51:53.215 2 INFO nova.compute.manager [req-e2f7ac01-f9ca-4fad-be3d-be9e82a06b28 req-8d660ffe-4155-4cf3-867f-1ea89e78b3b5 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Neutron deleted interface 76fa05ca-c22d-48b8-b82e-ed72d1971c4c; detaching it from the instance and deleting it from the info cache
Sep 30 07:51:53 compute-0 nova_compute[189265]: 2025-09-30 07:51:53.215 2 DEBUG nova.network.neutron [req-e2f7ac01-f9ca-4fad-be3d-be9e82a06b28 req-8d660ffe-4155-4cf3-867f-1ea89e78b3b5 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:51:53 compute-0 podman[227231]: 2025-09-30 07:51:53.532663499 +0000 UTC m=+0.100067247 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Sep 30 07:51:53 compute-0 podman[227232]: 2025-09-30 07:51:53.533160994 +0000 UTC m=+0.096464924 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 07:51:53 compute-0 podman[227233]: 2025-09-30 07:51:53.564643031 +0000 UTC m=+0.124419709 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 07:51:53 compute-0 nova_compute[189265]: 2025-09-30 07:51:53.638 2 DEBUG nova.network.neutron [-] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:51:53 compute-0 nova_compute[189265]: 2025-09-30 07:51:53.724 2 DEBUG nova.compute.manager [req-e2f7ac01-f9ca-4fad-be3d-be9e82a06b28 req-8d660ffe-4155-4cf3-867f-1ea89e78b3b5 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Detach interface failed, port_id=76fa05ca-c22d-48b8-b82e-ed72d1971c4c, reason: Instance 81f3990d-a6c1-43db-aa50-6b771758b8fb could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 07:51:53 compute-0 nova_compute[189265]: 2025-09-30 07:51:53.786 2 DEBUG nova.compute.manager [req-9b88d502-0a86-418f-bd7c-f91bd4e4cc42 req-1106a41c-713f-46f1-aa87-2af401824e9c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Received event network-vif-unplugged-76fa05ca-c22d-48b8-b82e-ed72d1971c4c external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:51:53 compute-0 nova_compute[189265]: 2025-09-30 07:51:53.787 2 DEBUG oslo_concurrency.lockutils [req-9b88d502-0a86-418f-bd7c-f91bd4e4cc42 req-1106a41c-713f-46f1-aa87-2af401824e9c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:51:53 compute-0 nova_compute[189265]: 2025-09-30 07:51:53.787 2 DEBUG oslo_concurrency.lockutils [req-9b88d502-0a86-418f-bd7c-f91bd4e4cc42 req-1106a41c-713f-46f1-aa87-2af401824e9c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:51:53 compute-0 nova_compute[189265]: 2025-09-30 07:51:53.787 2 DEBUG oslo_concurrency.lockutils [req-9b88d502-0a86-418f-bd7c-f91bd4e4cc42 req-1106a41c-713f-46f1-aa87-2af401824e9c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "81f3990d-a6c1-43db-aa50-6b771758b8fb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:51:53 compute-0 nova_compute[189265]: 2025-09-30 07:51:53.788 2 DEBUG nova.compute.manager [req-9b88d502-0a86-418f-bd7c-f91bd4e4cc42 req-1106a41c-713f-46f1-aa87-2af401824e9c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] No waiting events found dispatching network-vif-unplugged-76fa05ca-c22d-48b8-b82e-ed72d1971c4c pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:51:53 compute-0 nova_compute[189265]: 2025-09-30 07:51:53.788 2 DEBUG nova.compute.manager [req-9b88d502-0a86-418f-bd7c-f91bd4e4cc42 req-1106a41c-713f-46f1-aa87-2af401824e9c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Received event network-vif-unplugged-76fa05ca-c22d-48b8-b82e-ed72d1971c4c for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:51:54 compute-0 nova_compute[189265]: 2025-09-30 07:51:54.145 2 INFO nova.compute.manager [-] [instance: 81f3990d-a6c1-43db-aa50-6b771758b8fb] Took 1.43 seconds to deallocate network for instance.
Sep 30 07:51:54 compute-0 nova_compute[189265]: 2025-09-30 07:51:54.673 2 DEBUG oslo_concurrency.lockutils [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:51:54 compute-0 nova_compute[189265]: 2025-09-30 07:51:54.673 2 DEBUG oslo_concurrency.lockutils [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:51:54 compute-0 sshd-session[227152]: Received disconnect from 80.94.93.119 port 38044:11:  [preauth]
Sep 30 07:51:54 compute-0 sshd-session[227152]: Disconnected from authenticating user root 80.94.93.119 port 38044 [preauth]
Sep 30 07:51:54 compute-0 sshd-session[227152]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=80.94.93.119  user=root
Sep 30 07:51:54 compute-0 nova_compute[189265]: 2025-09-30 07:51:54.765 2 DEBUG nova.compute.provider_tree [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:51:55 compute-0 nova_compute[189265]: 2025-09-30 07:51:55.273 2 DEBUG nova.scheduler.client.report [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:51:55 compute-0 nova_compute[189265]: 2025-09-30 07:51:55.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:55 compute-0 nova_compute[189265]: 2025-09-30 07:51:55.787 2 DEBUG oslo_concurrency.lockutils [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.113s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:51:55 compute-0 nova_compute[189265]: 2025-09-30 07:51:55.813 2 INFO nova.scheduler.client.report [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Deleted allocations for instance 81f3990d-a6c1-43db-aa50-6b771758b8fb
Sep 30 07:51:56 compute-0 nova_compute[189265]: 2025-09-30 07:51:56.847 2 DEBUG oslo_concurrency.lockutils [None req-cb66f099-8dec-4e16-b8db-feb0e723edb4 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "81f3990d-a6c1-43db-aa50-6b771758b8fb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.990s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:51:57 compute-0 nova_compute[189265]: 2025-09-30 07:51:57.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:57 compute-0 nova_compute[189265]: 2025-09-30 07:51:57.788 2 DEBUG oslo_concurrency.lockutils [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Acquiring lock "58023b4a-dc78-4f0b-a216-57b512b9561c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:51:57 compute-0 nova_compute[189265]: 2025-09-30 07:51:57.789 2 DEBUG oslo_concurrency.lockutils [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "58023b4a-dc78-4f0b-a216-57b512b9561c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:51:57 compute-0 nova_compute[189265]: 2025-09-30 07:51:57.790 2 DEBUG oslo_concurrency.lockutils [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Acquiring lock "58023b4a-dc78-4f0b-a216-57b512b9561c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:51:57 compute-0 nova_compute[189265]: 2025-09-30 07:51:57.790 2 DEBUG oslo_concurrency.lockutils [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "58023b4a-dc78-4f0b-a216-57b512b9561c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:51:57 compute-0 nova_compute[189265]: 2025-09-30 07:51:57.790 2 DEBUG oslo_concurrency.lockutils [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "58023b4a-dc78-4f0b-a216-57b512b9561c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:51:57 compute-0 nova_compute[189265]: 2025-09-30 07:51:57.808 2 INFO nova.compute.manager [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Terminating instance
Sep 30 07:51:58 compute-0 nova_compute[189265]: 2025-09-30 07:51:58.329 2 DEBUG nova.compute.manager [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.12/site-packages/nova/compute/manager.py:3197
Sep 30 07:51:58 compute-0 kernel: tape73e78a1-01 (unregistering): left promiscuous mode
Sep 30 07:51:58 compute-0 NetworkManager[51813]: <info>  [1759218718.3634] device (tape73e78a1-01): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Sep 30 07:51:58 compute-0 ovn_controller[91436]: 2025-09-30T07:51:58Z|00308|binding|INFO|Releasing lport e73e78a1-01c0-43c0-b191-97cbd6fd06aa from this chassis (sb_readonly=0)
Sep 30 07:51:58 compute-0 ovn_controller[91436]: 2025-09-30T07:51:58Z|00309|binding|INFO|Setting lport e73e78a1-01c0-43c0-b191-97cbd6fd06aa down in Southbound
Sep 30 07:51:58 compute-0 ovn_controller[91436]: 2025-09-30T07:51:58Z|00310|binding|INFO|Removing iface tape73e78a1-01 ovn-installed in OVS
Sep 30 07:51:58 compute-0 nova_compute[189265]: 2025-09-30 07:51:58.381 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:58 compute-0 nova_compute[189265]: 2025-09-30 07:51:58.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:58.408 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:c1:55 10.100.0.10'], port_security=['fa:16:3e:46:c1:55 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-0.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '58023b4a-dc78-4f0b-a216-57b512b9561c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2d470809703a44e69c2bc0d283b2bce4', 'neutron:revision_number': '16', 'neutron:security_group_ids': '33f74b46-9f43-491b-8b86-3db30861b2d8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a9ce9f20-a5bb-4f10-94ac-caac3fb7e1ad, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>], logical_port=e73e78a1-01c0-43c0-b191-97cbd6fd06aa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7fbeac5a23f0>]) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:51:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:58.410 100322 INFO neutron.agent.ovn.metadata.agent [-] Port e73e78a1-01c0-43c0-b191-97cbd6fd06aa in datapath fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f unbound from our chassis
Sep 30 07:51:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:58.412 100322 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:756
Sep 30 07:51:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:58.414 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[5538a7e1-0d5f-45fe-a5fd-6fd1c56e9d00]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:58 compute-0 nova_compute[189265]: 2025-09-30 07:51:58.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:58.415 100322 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f namespace which is not needed anymore
Sep 30 07:51:58 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000022.scope: Deactivated successfully.
Sep 30 07:51:58 compute-0 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000022.scope: Consumed 3.293s CPU time.
Sep 30 07:51:58 compute-0 systemd-machined[149233]: Machine qemu-26-instance-00000022 terminated.
Sep 30 07:51:58 compute-0 nova_compute[189265]: 2025-09-30 07:51:58.508 2 DEBUG nova.compute.manager [req-2b6c9d06-3385-42ec-8b19-280f603d0d43 req-cc0d8e5e-4b8e-4cc9-be23-d6cf2d26ab7c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Received event network-vif-unplugged-e73e78a1-01c0-43c0-b191-97cbd6fd06aa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:51:58 compute-0 nova_compute[189265]: 2025-09-30 07:51:58.509 2 DEBUG oslo_concurrency.lockutils [req-2b6c9d06-3385-42ec-8b19-280f603d0d43 req-cc0d8e5e-4b8e-4cc9-be23-d6cf2d26ab7c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "58023b4a-dc78-4f0b-a216-57b512b9561c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:51:58 compute-0 nova_compute[189265]: 2025-09-30 07:51:58.510 2 DEBUG oslo_concurrency.lockutils [req-2b6c9d06-3385-42ec-8b19-280f603d0d43 req-cc0d8e5e-4b8e-4cc9-be23-d6cf2d26ab7c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "58023b4a-dc78-4f0b-a216-57b512b9561c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:51:58 compute-0 nova_compute[189265]: 2025-09-30 07:51:58.510 2 DEBUG oslo_concurrency.lockutils [req-2b6c9d06-3385-42ec-8b19-280f603d0d43 req-cc0d8e5e-4b8e-4cc9-be23-d6cf2d26ab7c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "58023b4a-dc78-4f0b-a216-57b512b9561c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:51:58 compute-0 nova_compute[189265]: 2025-09-30 07:51:58.510 2 DEBUG nova.compute.manager [req-2b6c9d06-3385-42ec-8b19-280f603d0d43 req-cc0d8e5e-4b8e-4cc9-be23-d6cf2d26ab7c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] No waiting events found dispatching network-vif-unplugged-e73e78a1-01c0-43c0-b191-97cbd6fd06aa pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:51:58 compute-0 nova_compute[189265]: 2025-09-30 07:51:58.511 2 DEBUG nova.compute.manager [req-2b6c9d06-3385-42ec-8b19-280f603d0d43 req-cc0d8e5e-4b8e-4cc9-be23-d6cf2d26ab7c 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Received event network-vif-unplugged-e73e78a1-01c0-43c0-b191-97cbd6fd06aa for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:51:58 compute-0 nova_compute[189265]: 2025-09-30 07:51:58.604 2 INFO nova.virt.libvirt.driver [-] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Instance destroyed successfully.
Sep 30 07:51:58 compute-0 nova_compute[189265]: 2025-09-30 07:51:58.605 2 DEBUG nova.objects.instance [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lazy-loading 'resources' on Instance uuid 58023b4a-dc78-4f0b-a216-57b512b9561c obj_load_attr /usr/lib/python3.12/site-packages/nova/objects/instance.py:1141
Sep 30 07:51:58 compute-0 neutron-haproxy-ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f[226862]: [NOTICE]   (226866) : haproxy version is 3.0.5-8e879a5
Sep 30 07:51:58 compute-0 neutron-haproxy-ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f[226862]: [NOTICE]   (226866) : path to executable is /usr/sbin/haproxy
Sep 30 07:51:58 compute-0 neutron-haproxy-ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f[226862]: [WARNING]  (226866) : Exiting Master process...
Sep 30 07:51:58 compute-0 neutron-haproxy-ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f[226862]: [ALERT]    (226866) : Current worker (226868) exited with code 143 (Terminated)
Sep 30 07:51:58 compute-0 neutron-haproxy-ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f[226862]: [WARNING]  (226866) : All workers exited. Exiting... (0)
Sep 30 07:51:58 compute-0 podman[227317]: 2025-09-30 07:51:58.61335184 +0000 UTC m=+0.064048558 container kill b7edb61101377cdf61549f13f36e105a5dfe6ab712c01101ed065bbc5cb8b2c2 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4)
Sep 30 07:51:58 compute-0 systemd[1]: libpod-b7edb61101377cdf61549f13f36e105a5dfe6ab712c01101ed065bbc5cb8b2c2.scope: Deactivated successfully.
Sep 30 07:51:58 compute-0 podman[227349]: 2025-09-30 07:51:58.66327283 +0000 UTC m=+0.030427039 container died b7edb61101377cdf61549f13f36e105a5dfe6ab712c01101ed065bbc5cb8b2c2 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS)
Sep 30 07:51:58 compute-0 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b7edb61101377cdf61549f13f36e105a5dfe6ab712c01101ed065bbc5cb8b2c2-userdata-shm.mount: Deactivated successfully.
Sep 30 07:51:58 compute-0 systemd[1]: var-lib-containers-storage-overlay-976a317208e8e60d58c0cbef569bf75e88364e311dd8e1bc03a80c9a74d0bc8c-merged.mount: Deactivated successfully.
Sep 30 07:51:58 compute-0 podman[227349]: 2025-09-30 07:51:58.722428096 +0000 UTC m=+0.089582315 container cleanup b7edb61101377cdf61549f13f36e105a5dfe6ab712c01101ed065bbc5cb8b2c2 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, io.buildah.version=1.41.4)
Sep 30 07:51:58 compute-0 systemd[1]: libpod-conmon-b7edb61101377cdf61549f13f36e105a5dfe6ab712c01101ed065bbc5cb8b2c2.scope: Deactivated successfully.
Sep 30 07:51:58 compute-0 podman[227356]: 2025-09-30 07:51:58.747836739 +0000 UTC m=+0.086522087 container remove b7edb61101377cdf61549f13f36e105a5dfe6ab712c01101ed065bbc5cb8b2c2 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=neutron-haproxy-ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Sep 30 07:51:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:58.771 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[364c0d23-a67e-4642-be36-dcec820bd7de]: (4, ("Tue Sep 30 07:51:58 AM UTC 2025 Sending signal '15' to neutron-haproxy-ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f (b7edb61101377cdf61549f13f36e105a5dfe6ab712c01101ed065bbc5cb8b2c2)\nb7edb61101377cdf61549f13f36e105a5dfe6ab712c01101ed065bbc5cb8b2c2\nTue Sep 30 07:51:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f (b7edb61101377cdf61549f13f36e105a5dfe6ab712c01101ed065bbc5cb8b2c2)\nb7edb61101377cdf61549f13f36e105a5dfe6ab712c01101ed065bbc5cb8b2c2\n", '', 0)) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:58.774 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[f5bb5ee5-112c-4983-8fba-7adbd1bcb9d3]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:58.774 100322 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f.pid.haproxy' get_value_from_file /usr/lib/python3.12/site-packages/neutron/agent/linux/utils.py:268
Sep 30 07:51:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:58.775 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[e6b9e302-1903-42e7-b26a-3360339b10c9]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:58.776 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe7b13d3-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:58 compute-0 nova_compute[189265]: 2025-09-30 07:51:58.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:58 compute-0 kernel: tapfe7b13d3-f0: left promiscuous mode
Sep 30 07:51:58 compute-0 nova_compute[189265]: 2025-09-30 07:51:58.809 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:58.813 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[27ad96f9-d30c-4674-85e3-a16c26305e9d]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:58.840 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[df452e52-2d84-4084-90e1-5284aaea91c2]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:58.842 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[8bc12238-2578-4d7c-9e01-fdf47ca87289]: (4, True) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:58.869 210650 DEBUG oslo.privsep.daemon [-] privsep: reply[2811123c-e618-40e1-aa37-4a6bbd759310]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 651785, 'reachable_time': 21650, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227385, 'error': None, 'target': 'ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:58.872 100440 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f deleted. remove_netns /usr/lib/python3.12/site-packages/neutron/privileged/agent/linux/ip_lib.py:603
Sep 30 07:51:58 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:51:58.873 100440 DEBUG oslo.privsep.daemon [-] privsep: reply[5cfd7ecd-bd88-4b9c-8b4c-203db353ed9a]: (4, None) _call_back /usr/lib/python3.12/site-packages/oslo_privsep/daemon.py:515
Sep 30 07:51:58 compute-0 systemd[1]: run-netns-ovnmeta\x2dfe7b13d3\x2df2aa\x2d4dfd\x2da4fa\x2dcc5f0ed5e32f.mount: Deactivated successfully.
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.112 2 DEBUG nova.virt.libvirt.vif [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,compute_id=2,config_drive='True',created_at=2025-09-30T07:50:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestExecuteZoneMigrationStrategy-server-981327066',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testexecutezonemigrationstrategy-server-981327066',id=34,image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-09-30T07:50:38Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2d470809703a44e69c2bc0d283b2bce4',ramdisk_id='',reservation_id='r-jrwm4zqc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,admin,reader,manager',clean_attempts='1',image_base_image_ref='0c6b92f5-9861-49e4-862d-3ffd84520dfa',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestExecuteZoneMigrationStrategy-642893642',owner_user_name='tempest-TestExecuteZoneMigrationStrategy-642893642-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-09-30T07:51:46Z,user_data=None,user_id='f267aaead8a3437a8359b21224982b1c',uuid=58023b4a-dc78-4f0b-a216-57b512b9561c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e73e78a1-01c0-43c0-b191-97cbd6fd06aa", "address": "fa:16:3e:46:c1:55", "network": {"id": "fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1009939787-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3bc30bc6516c4e49aed5726171c74d6f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape73e78a1-01", "ovs_interfaceid": "e73e78a1-01c0-43c0-b191-97cbd6fd06aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.12/site-packages/nova/virt/libvirt/vif.py:839
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.113 2 DEBUG nova.network.os_vif_util [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Converting VIF {"id": "e73e78a1-01c0-43c0-b191-97cbd6fd06aa", "address": "fa:16:3e:46:c1:55", "network": {"id": "fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f", "bridge": "br-int", "label": "tempest-TestExecuteZoneMigrationStrategy-1009939787-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3bc30bc6516c4e49aed5726171c74d6f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape73e78a1-01", "ovs_interfaceid": "e73e78a1-01c0-43c0-b191-97cbd6fd06aa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:511
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.114 2 DEBUG nova.network.os_vif_util [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:46:c1:55,bridge_name='br-int',has_traffic_filtering=True,id=e73e78a1-01c0-43c0-b191-97cbd6fd06aa,network=Network(fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape73e78a1-01') nova_to_osvif_vif /usr/lib/python3.12/site-packages/nova/network/os_vif_util.py:548
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.115 2 DEBUG os_vif [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:c1:55,bridge_name='br-int',has_traffic_filtering=True,id=e73e78a1-01c0-43c0-b191-97cbd6fd06aa,network=Network(fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape73e78a1-01') unplug /usr/lib/python3.12/site-packages/os_vif/__init__.py:109
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.117 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape73e78a1-01, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.119 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.120 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.122 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbDestroyCommand(_result=None, table=QoS, record=c736cfee-ea68-4925-9852-e0a3d3475417) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.127 2 INFO os_vif [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:c1:55,bridge_name='br-int',has_traffic_filtering=True,id=e73e78a1-01c0-43c0-b191-97cbd6fd06aa,network=Network(fe7b13d3-f2aa-4dfd-a4fa-cc5f0ed5e32f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape73e78a1-01')
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.128 2 INFO nova.virt.libvirt.driver [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Deleting instance files /var/lib/nova/instances/58023b4a-dc78-4f0b-a216-57b512b9561c_del
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.128 2 INFO nova.virt.libvirt.driver [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Deletion of /var/lib/nova/instances/58023b4a-dc78-4f0b-a216-57b512b9561c_del complete
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.642 2 INFO nova.compute.manager [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Took 1.31 seconds to destroy the instance on the hypervisor.
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.643 2 DEBUG oslo.service.backend._eventlet.loopingcall [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.12/site-packages/oslo_service/backend/_eventlet/loopingcall.py:437
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.643 2 DEBUG nova.compute.manager [-] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Deallocating network for instance _deallocate_network /usr/lib/python3.12/site-packages/nova/compute/manager.py:2324
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.644 2 DEBUG nova.network.neutron [-] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.12/site-packages/nova/network/neutron.py:1863
Sep 30 07:51:59 compute-0 nova_compute[189265]: 2025-09-30 07:51:59.644 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:51:59 compute-0 podman[199733]: time="2025-09-30T07:51:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:51:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:51:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:51:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:51:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3015 "" "Go-http-client/1.1"
Sep 30 07:52:00 compute-0 nova_compute[189265]: 2025-09-30 07:52:00.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:00 compute-0 nova_compute[189265]: 2025-09-30 07:52:00.554 2 WARNING neutronclient.v2_0.client [-] The python binding code in neutronclient is deprecated in favor of OpenstackSDK, please use that as this will be removed in a future release.
Sep 30 07:52:00 compute-0 nova_compute[189265]: 2025-09-30 07:52:00.568 2 DEBUG nova.compute.manager [req-ed249819-06a0-4f6d-a6b0-869dccce972a req-8e2afc76-c767-4e36-b2e8-03950b187f3f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Received event network-vif-unplugged-e73e78a1-01c0-43c0-b191-97cbd6fd06aa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:52:00 compute-0 nova_compute[189265]: 2025-09-30 07:52:00.568 2 DEBUG oslo_concurrency.lockutils [req-ed249819-06a0-4f6d-a6b0-869dccce972a req-8e2afc76-c767-4e36-b2e8-03950b187f3f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Acquiring lock "58023b4a-dc78-4f0b-a216-57b512b9561c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:52:00 compute-0 nova_compute[189265]: 2025-09-30 07:52:00.569 2 DEBUG oslo_concurrency.lockutils [req-ed249819-06a0-4f6d-a6b0-869dccce972a req-8e2afc76-c767-4e36-b2e8-03950b187f3f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "58023b4a-dc78-4f0b-a216-57b512b9561c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:52:00 compute-0 nova_compute[189265]: 2025-09-30 07:52:00.569 2 DEBUG oslo_concurrency.lockutils [req-ed249819-06a0-4f6d-a6b0-869dccce972a req-8e2afc76-c767-4e36-b2e8-03950b187f3f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] Lock "58023b4a-dc78-4f0b-a216-57b512b9561c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:52:00 compute-0 nova_compute[189265]: 2025-09-30 07:52:00.570 2 DEBUG nova.compute.manager [req-ed249819-06a0-4f6d-a6b0-869dccce972a req-8e2afc76-c767-4e36-b2e8-03950b187f3f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] No waiting events found dispatching network-vif-unplugged-e73e78a1-01c0-43c0-b191-97cbd6fd06aa pop_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:344
Sep 30 07:52:00 compute-0 nova_compute[189265]: 2025-09-30 07:52:00.570 2 DEBUG nova.compute.manager [req-ed249819-06a0-4f6d-a6b0-869dccce972a req-8e2afc76-c767-4e36-b2e8-03950b187f3f 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Received event network-vif-unplugged-e73e78a1-01c0-43c0-b191-97cbd6fd06aa for instance with task_state deleting. _process_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11590
Sep 30 07:52:00 compute-0 nova_compute[189265]: 2025-09-30 07:52:00.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:52:01 compute-0 openstack_network_exporter[201859]: ERROR   07:52:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:52:01 compute-0 openstack_network_exporter[201859]: ERROR   07:52:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:52:01 compute-0 openstack_network_exporter[201859]: ERROR   07:52:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:52:01 compute-0 openstack_network_exporter[201859]: ERROR   07:52:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:52:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:52:01 compute-0 openstack_network_exporter[201859]: ERROR   07:52:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:52:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:52:01 compute-0 nova_compute[189265]: 2025-09-30 07:52:01.670 2 DEBUG nova.compute.manager [req-cf5b15a9-0958-4a25-bea3-d6a88d169878 req-f4bfb62d-e1ac-4bfc-b0c4-e0b4026e7cec 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Received event network-vif-deleted-e73e78a1-01c0-43c0-b191-97cbd6fd06aa external_instance_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11812
Sep 30 07:52:01 compute-0 nova_compute[189265]: 2025-09-30 07:52:01.670 2 INFO nova.compute.manager [req-cf5b15a9-0958-4a25-bea3-d6a88d169878 req-f4bfb62d-e1ac-4bfc-b0c4-e0b4026e7cec 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Neutron deleted interface e73e78a1-01c0-43c0-b191-97cbd6fd06aa; detaching it from the instance and deleting it from the info cache
Sep 30 07:52:01 compute-0 nova_compute[189265]: 2025-09-30 07:52:01.670 2 DEBUG nova.network.neutron [req-cf5b15a9-0958-4a25-bea3-d6a88d169878 req-f4bfb62d-e1ac-4bfc-b0c4-e0b4026e7cec 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:52:01 compute-0 nova_compute[189265]: 2025-09-30 07:52:01.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:52:02 compute-0 nova_compute[189265]: 2025-09-30 07:52:02.076 2 DEBUG nova.network.neutron [-] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.12/site-packages/nova/network/neutron.py:116
Sep 30 07:52:02 compute-0 nova_compute[189265]: 2025-09-30 07:52:02.177 2 DEBUG nova.compute.manager [req-cf5b15a9-0958-4a25-bea3-d6a88d169878 req-f4bfb62d-e1ac-4bfc-b0c4-e0b4026e7cec 7f7285f252c1442588e0ad5b6dd710ea c907a5861ae4441bb28195d4a66713bf - - default default] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Detach interface failed, port_id=e73e78a1-01c0-43c0-b191-97cbd6fd06aa, reason: Instance 58023b4a-dc78-4f0b-a216-57b512b9561c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.12/site-packages/nova/compute/manager.py:11646
Sep 30 07:52:02 compute-0 nova_compute[189265]: 2025-09-30 07:52:02.582 2 INFO nova.compute.manager [-] [instance: 58023b4a-dc78-4f0b-a216-57b512b9561c] Took 2.94 seconds to deallocate network for instance.
Sep 30 07:52:02 compute-0 nova_compute[189265]: 2025-09-30 07:52:02.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:52:03 compute-0 nova_compute[189265]: 2025-09-30 07:52:03.104 2 DEBUG oslo_concurrency.lockutils [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:52:03 compute-0 nova_compute[189265]: 2025-09-30 07:52:03.104 2 DEBUG oslo_concurrency.lockutils [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:52:03 compute-0 nova_compute[189265]: 2025-09-30 07:52:03.111 2 DEBUG oslo_concurrency.lockutils [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:52:03 compute-0 nova_compute[189265]: 2025-09-30 07:52:03.156 2 INFO nova.scheduler.client.report [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Deleted allocations for instance 58023b4a-dc78-4f0b-a216-57b512b9561c
Sep 30 07:52:04 compute-0 nova_compute[189265]: 2025-09-30 07:52:04.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:04 compute-0 nova_compute[189265]: 2025-09-30 07:52:04.185 2 DEBUG oslo_concurrency.lockutils [None req-8701a489-5a6b-4bd1-ab3f-bc8163f7ff08 f267aaead8a3437a8359b21224982b1c 2d470809703a44e69c2bc0d283b2bce4 - - default default] Lock "58023b4a-dc78-4f0b-a216-57b512b9561c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.395s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:52:05 compute-0 nova_compute[189265]: 2025-09-30 07:52:05.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:05 compute-0 podman[227386]: 2025-09-30 07:52:05.48209603 +0000 UTC m=+0.063027039 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:52:08 compute-0 nova_compute[189265]: 2025-09-30 07:52:08.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:08 compute-0 nova_compute[189265]: 2025-09-30 07:52:08.794 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:52:08 compute-0 nova_compute[189265]: 2025-09-30 07:52:08.794 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:52:09 compute-0 nova_compute[189265]: 2025-09-30 07:52:09.127 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:09 compute-0 nova_compute[189265]: 2025-09-30 07:52:09.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:52:09 compute-0 nova_compute[189265]: 2025-09-30 07:52:09.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:52:10 compute-0 nova_compute[189265]: 2025-09-30 07:52:10.305 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:52:10 compute-0 nova_compute[189265]: 2025-09-30 07:52:10.305 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:52:10 compute-0 nova_compute[189265]: 2025-09-30 07:52:10.305 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:52:10 compute-0 nova_compute[189265]: 2025-09-30 07:52:10.306 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:52:10 compute-0 nova_compute[189265]: 2025-09-30 07:52:10.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:10 compute-0 nova_compute[189265]: 2025-09-30 07:52:10.537 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:52:10 compute-0 nova_compute[189265]: 2025-09-30 07:52:10.539 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:52:10 compute-0 nova_compute[189265]: 2025-09-30 07:52:10.573 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.033s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:52:10 compute-0 nova_compute[189265]: 2025-09-30 07:52:10.574 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5826MB free_disk=73.3033218383789GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:52:10 compute-0 nova_compute[189265]: 2025-09-30 07:52:10.574 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:52:10 compute-0 nova_compute[189265]: 2025-09-30 07:52:10.575 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:52:11 compute-0 sshd-session[227411]: Invalid user reelforge from 103.57.223.153 port 34570
Sep 30 07:52:11 compute-0 sshd-session[227411]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:52:11 compute-0 sshd-session[227411]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.57.223.153
Sep 30 07:52:11 compute-0 nova_compute[189265]: 2025-09-30 07:52:11.639 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:52:11 compute-0 nova_compute[189265]: 2025-09-30 07:52:11.640 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:52:10 up  1:49,  0 user,  load average: 0.42, 0.26, 0.27\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:52:11 compute-0 nova_compute[189265]: 2025-09-30 07:52:11.684 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:52:12 compute-0 nova_compute[189265]: 2025-09-30 07:52:12.194 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:52:12 compute-0 nova_compute[189265]: 2025-09-30 07:52:12.709 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:52:12 compute-0 nova_compute[189265]: 2025-09-30 07:52:12.710 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.135s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:52:13 compute-0 sshd-session[227411]: Failed password for invalid user reelforge from 103.57.223.153 port 34570 ssh2
Sep 30 07:52:14 compute-0 sshd-session[227411]: Received disconnect from 103.57.223.153 port 34570:11: Bye Bye [preauth]
Sep 30 07:52:14 compute-0 sshd-session[227411]: Disconnected from invalid user reelforge 103.57.223.153 port 34570 [preauth]
Sep 30 07:52:14 compute-0 nova_compute[189265]: 2025-09-30 07:52:14.130 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:15 compute-0 nova_compute[189265]: 2025-09-30 07:52:15.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:15 compute-0 nova_compute[189265]: 2025-09-30 07:52:15.710 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:52:16 compute-0 podman[227416]: 2025-09-30 07:52:16.513863077 +0000 UTC m=+0.090593654 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:52:19 compute-0 nova_compute[189265]: 2025-09-30 07:52:19.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:20 compute-0 nova_compute[189265]: 2025-09-30 07:52:20.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:52:20.603 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:52:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:52:20.603 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:52:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:52:20.603 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:52:20 compute-0 nova_compute[189265]: 2025-09-30 07:52:20.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:52:21 compute-0 podman[227438]: 2025-09-30 07:52:21.523924142 +0000 UTC m=+0.097268157 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, container_name=openstack_network_exporter)
Sep 30 07:52:24 compute-0 nova_compute[189265]: 2025-09-30 07:52:24.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:24 compute-0 podman[227460]: 2025-09-30 07:52:24.498470812 +0000 UTC m=+0.082493221 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 07:52:24 compute-0 podman[227461]: 2025-09-30 07:52:24.513437963 +0000 UTC m=+0.085640081 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:52:24 compute-0 podman[227462]: 2025-09-30 07:52:24.541243565 +0000 UTC m=+0.118156049 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, managed_by=edpm_ansible)
Sep 30 07:52:25 compute-0 nova_compute[189265]: 2025-09-30 07:52:25.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:27 compute-0 sshd-session[227525]: Connection closed by 152.32.144.167 port 50194 [preauth]
Sep 30 07:52:27 compute-0 nova_compute[189265]: 2025-09-30 07:52:27.784 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:52:29 compute-0 nova_compute[189265]: 2025-09-30 07:52:29.138 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:29 compute-0 podman[199733]: time="2025-09-30T07:52:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:52:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:52:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:52:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:52:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3019 "" "Go-http-client/1.1"
Sep 30 07:52:30 compute-0 nova_compute[189265]: 2025-09-30 07:52:30.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:31 compute-0 openstack_network_exporter[201859]: ERROR   07:52:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:52:31 compute-0 openstack_network_exporter[201859]: ERROR   07:52:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:52:31 compute-0 openstack_network_exporter[201859]: ERROR   07:52:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:52:31 compute-0 openstack_network_exporter[201859]: ERROR   07:52:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:52:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:52:31 compute-0 openstack_network_exporter[201859]: ERROR   07:52:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:52:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:52:32 compute-0 sshd-session[227527]: Invalid user student from 159.89.22.242 port 54038
Sep 30 07:52:32 compute-0 sshd-session[227527]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:52:32 compute-0 sshd-session[227527]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=159.89.22.242
Sep 30 07:52:34 compute-0 nova_compute[189265]: 2025-09-30 07:52:34.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:35 compute-0 sshd-session[227527]: Failed password for invalid user student from 159.89.22.242 port 54038 ssh2
Sep 30 07:52:35 compute-0 nova_compute[189265]: 2025-09-30 07:52:35.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:36 compute-0 sshd-session[227527]: Received disconnect from 159.89.22.242 port 54038:11: Bye Bye [preauth]
Sep 30 07:52:36 compute-0 sshd-session[227527]: Disconnected from invalid user student 159.89.22.242 port 54038 [preauth]
Sep 30 07:52:36 compute-0 podman[227529]: 2025-09-30 07:52:36.492233364 +0000 UTC m=+0.066352955 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:52:39 compute-0 nova_compute[189265]: 2025-09-30 07:52:39.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:40 compute-0 nova_compute[189265]: 2025-09-30 07:52:40.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:44 compute-0 nova_compute[189265]: 2025-09-30 07:52:44.190 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:45 compute-0 nova_compute[189265]: 2025-09-30 07:52:45.350 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:46 compute-0 ovn_controller[91436]: 2025-09-30T07:52:46Z|00311|memory_trim|INFO|Detected inactivity (last active 30015 ms ago): trimming memory
Sep 30 07:52:47 compute-0 podman[227553]: 2025-09-30 07:52:47.484038268 +0000 UTC m=+0.069742613 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=iscsid, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 07:52:49 compute-0 nova_compute[189265]: 2025-09-30 07:52:49.191 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:50 compute-0 nova_compute[189265]: 2025-09-30 07:52:50.353 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:52 compute-0 podman[227574]: 2025-09-30 07:52:52.509713122 +0000 UTC m=+0.085648932 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 07:52:54 compute-0 nova_compute[189265]: 2025-09-30 07:52:54.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:55 compute-0 nova_compute[189265]: 2025-09-30 07:52:55.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:55 compute-0 podman[227597]: 2025-09-30 07:52:55.491657985 +0000 UTC m=+0.065095989 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, config_id=ovn_metadata_agent)
Sep 30 07:52:55 compute-0 podman[227596]: 2025-09-30 07:52:55.493640362 +0000 UTC m=+0.067998492 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, container_name=multipathd, org.label-schema.schema-version=1.0, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930)
Sep 30 07:52:55 compute-0 podman[227598]: 2025-09-30 07:52:55.563672692 +0000 UTC m=+0.123839023 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 07:52:59 compute-0 nova_compute[189265]: 2025-09-30 07:52:59.195 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:52:59 compute-0 podman[199733]: time="2025-09-30T07:52:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:52:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:52:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:52:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:52:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Sep 30 07:53:00 compute-0 nova_compute[189265]: 2025-09-30 07:53:00.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:01 compute-0 openstack_network_exporter[201859]: ERROR   07:53:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:53:01 compute-0 openstack_network_exporter[201859]: ERROR   07:53:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:53:01 compute-0 openstack_network_exporter[201859]: ERROR   07:53:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:53:01 compute-0 openstack_network_exporter[201859]: ERROR   07:53:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:53:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:53:01 compute-0 openstack_network_exporter[201859]: ERROR   07:53:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:53:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:53:02 compute-0 nova_compute[189265]: 2025-09-30 07:53:02.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:53:02 compute-0 nova_compute[189265]: 2025-09-30 07:53:02.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:53:02 compute-0 nova_compute[189265]: 2025-09-30 07:53:02.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:53:04 compute-0 nova_compute[189265]: 2025-09-30 07:53:04.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:05 compute-0 nova_compute[189265]: 2025-09-30 07:53:05.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:07 compute-0 podman[227656]: 2025-09-30 07:53:07.50665823 +0000 UTC m=+0.078406212 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:53:09 compute-0 nova_compute[189265]: 2025-09-30 07:53:09.199 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:09 compute-0 sshd-session[227679]: Accepted publickey for zuul from 192.168.122.10 port 45346 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 07:53:09 compute-0 systemd-logind[824]: New session 35 of user zuul.
Sep 30 07:53:09 compute-0 systemd[1]: Started Session 35 of User zuul.
Sep 30 07:53:09 compute-0 sshd-session[227679]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 07:53:09 compute-0 sudo[227683]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Sep 30 07:53:09 compute-0 sudo[227683]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:53:09 compute-0 nova_compute[189265]: 2025-09-30 07:53:09.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:53:10 compute-0 nova_compute[189265]: 2025-09-30 07:53:10.303 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:53:10 compute-0 nova_compute[189265]: 2025-09-30 07:53:10.304 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:53:10 compute-0 nova_compute[189265]: 2025-09-30 07:53:10.304 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:53:10 compute-0 nova_compute[189265]: 2025-09-30 07:53:10.304 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:53:10 compute-0 nova_compute[189265]: 2025-09-30 07:53:10.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:10 compute-0 nova_compute[189265]: 2025-09-30 07:53:10.484 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:53:10 compute-0 nova_compute[189265]: 2025-09-30 07:53:10.485 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:53:10 compute-0 nova_compute[189265]: 2025-09-30 07:53:10.528 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.043s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:53:10 compute-0 nova_compute[189265]: 2025-09-30 07:53:10.529 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5815MB free_disk=73.29555130004883GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:53:10 compute-0 nova_compute[189265]: 2025-09-30 07:53:10.529 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:53:10 compute-0 nova_compute[189265]: 2025-09-30 07:53:10.530 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:53:11 compute-0 nova_compute[189265]: 2025-09-30 07:53:11.591 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:53:11 compute-0 nova_compute[189265]: 2025-09-30 07:53:11.592 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:53:10 up  1:50,  0 user,  load average: 0.15, 0.21, 0.25\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:53:11 compute-0 nova_compute[189265]: 2025-09-30 07:53:11.620 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:53:12 compute-0 nova_compute[189265]: 2025-09-30 07:53:12.128 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:53:12 compute-0 nova_compute[189265]: 2025-09-30 07:53:12.640 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:53:12 compute-0 nova_compute[189265]: 2025-09-30 07:53:12.641 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.111s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:53:13 compute-0 nova_compute[189265]: 2025-09-30 07:53:13.641 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:53:13 compute-0 nova_compute[189265]: 2025-09-30 07:53:13.642 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:53:13 compute-0 nova_compute[189265]: 2025-09-30 07:53:13.642 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:53:13 compute-0 nova_compute[189265]: 2025-09-30 07:53:13.642 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:53:14 compute-0 nova_compute[189265]: 2025-09-30 07:53:14.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:15 compute-0 nova_compute[189265]: 2025-09-30 07:53:15.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:15 compute-0 ovs-vsctl[227860]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Sep 30 07:53:16 compute-0 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 227707 (sos)
Sep 30 07:53:16 compute-0 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Sep 30 07:53:16 compute-0 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Sep 30 07:53:16 compute-0 virtqemud[189090]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Sep 30 07:53:16 compute-0 virtqemud[189090]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Sep 30 07:53:16 compute-0 virtqemud[189090]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Sep 30 07:53:17 compute-0 kernel: block vda: the capability attribute has been deprecated.
Sep 30 07:53:17 compute-0 crontab[228278]: (root) LIST (root)
Sep 30 07:53:18 compute-0 podman[228335]: 2025-09-30 07:53:18.49000506 +0000 UTC m=+0.066595651 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250930, tcib_managed=true)
Sep 30 07:53:19 compute-0 nova_compute[189265]: 2025-09-30 07:53:19.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:19 compute-0 systemd[1]: Starting Hostname Service...
Sep 30 07:53:19 compute-0 systemd[1]: Started Hostname Service.
Sep 30 07:53:20 compute-0 nova_compute[189265]: 2025-09-30 07:53:20.365 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:53:20.604 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:53:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:53:20.604 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:53:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:53:20.604 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:53:21 compute-0 nova_compute[189265]: 2025-09-30 07:53:21.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:53:23 compute-0 podman[228667]: 2025-09-30 07:53:23.482603891 +0000 UTC m=+0.060301120 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=edpm, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=)
Sep 30 07:53:24 compute-0 nova_compute[189265]: 2025-09-30 07:53:24.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:24 compute-0 unix_chkpwd[228887]: password check failed for user (root)
Sep 30 07:53:24 compute-0 sshd-session[228619]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.57.223.153  user=root
Sep 30 07:53:25 compute-0 nova_compute[189265]: 2025-09-30 07:53:25.366 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:25 compute-0 ovs-appctl[229421]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 07:53:25 compute-0 ovs-appctl[229427]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 07:53:25 compute-0 ovs-appctl[229443]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Sep 30 07:53:26 compute-0 sshd-session[228619]: Failed password for root from 103.57.223.153 port 36880 ssh2
Sep 30 07:53:26 compute-0 podman[229677]: 2025-09-30 07:53:26.484451198 +0000 UTC m=+0.067244190 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930)
Sep 30 07:53:26 compute-0 podman[229673]: 2025-09-30 07:53:26.510799148 +0000 UTC m=+0.097726850 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest)
Sep 30 07:53:26 compute-0 podman[229679]: 2025-09-30 07:53:26.512507287 +0000 UTC m=+0.086937518 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0)
Sep 30 07:53:26 compute-0 sshd-session[228619]: Received disconnect from 103.57.223.153 port 36880:11: Bye Bye [preauth]
Sep 30 07:53:26 compute-0 sshd-session[228619]: Disconnected from authenticating user root 103.57.223.153 port 36880 [preauth]
Sep 30 07:53:29 compute-0 nova_compute[189265]: 2025-09-30 07:53:29.206 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:29 compute-0 podman[199733]: time="2025-09-30T07:53:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:53:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:53:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:53:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:53:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3019 "" "Go-http-client/1.1"
Sep 30 07:53:30 compute-0 nova_compute[189265]: 2025-09-30 07:53:30.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:31 compute-0 openstack_network_exporter[201859]: ERROR   07:53:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:53:31 compute-0 openstack_network_exporter[201859]: ERROR   07:53:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:53:31 compute-0 openstack_network_exporter[201859]: ERROR   07:53:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:53:31 compute-0 openstack_network_exporter[201859]: ERROR   07:53:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:53:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:53:31 compute-0 openstack_network_exporter[201859]: ERROR   07:53:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:53:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:53:31 compute-0 sshd-session[230427]: Invalid user gitlab-runner from 159.89.22.242 port 58466
Sep 30 07:53:31 compute-0 sshd-session[230427]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:53:31 compute-0 sshd-session[230427]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=159.89.22.242
Sep 30 07:53:33 compute-0 virtqemud[189090]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Sep 30 07:53:33 compute-0 sshd-session[230427]: Failed password for invalid user gitlab-runner from 159.89.22.242 port 58466 ssh2
Sep 30 07:53:34 compute-0 nova_compute[189265]: 2025-09-30 07:53:34.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:35 compute-0 sshd-session[230427]: Received disconnect from 159.89.22.242 port 58466:11: Bye Bye [preauth]
Sep 30 07:53:35 compute-0 sshd-session[230427]: Disconnected from invalid user gitlab-runner 159.89.22.242 port 58466 [preauth]
Sep 30 07:53:35 compute-0 nova_compute[189265]: 2025-09-30 07:53:35.409 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:35 compute-0 systemd[1]: Starting Time & Date Service...
Sep 30 07:53:35 compute-0 systemd[1]: Started Time & Date Service.
Sep 30 07:53:37 compute-0 systemd[1]: virtproxyd.service: Deactivated successfully.
Sep 30 07:53:38 compute-0 podman[230918]: 2025-09-30 07:53:38.044133449 +0000 UTC m=+0.087258157 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:53:39 compute-0 nova_compute[189265]: 2025-09-30 07:53:39.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:40 compute-0 nova_compute[189265]: 2025-09-30 07:53:40.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:40 compute-0 sshd-session[230944]: Invalid user iptv from 152.32.144.167 port 49324
Sep 30 07:53:40 compute-0 sshd-session[230944]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:53:40 compute-0 sshd-session[230944]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=152.32.144.167
Sep 30 07:53:42 compute-0 sshd-session[230944]: Failed password for invalid user iptv from 152.32.144.167 port 49324 ssh2
Sep 30 07:53:44 compute-0 nova_compute[189265]: 2025-09-30 07:53:44.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:44 compute-0 sshd-session[230944]: Received disconnect from 152.32.144.167 port 49324:11: Bye Bye [preauth]
Sep 30 07:53:44 compute-0 sshd-session[230944]: Disconnected from invalid user iptv 152.32.144.167 port 49324 [preauth]
Sep 30 07:53:45 compute-0 nova_compute[189265]: 2025-09-30 07:53:45.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:49 compute-0 nova_compute[189265]: 2025-09-30 07:53:49.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:49 compute-0 podman[230946]: 2025-09-30 07:53:49.337276985 +0000 UTC m=+0.064333977 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, container_name=iscsid, io.buildah.version=1.41.4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:53:50 compute-0 nova_compute[189265]: 2025-09-30 07:53:50.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:53 compute-0 sudo[227683]: pam_unix(sudo:session): session closed for user root
Sep 30 07:53:53 compute-0 sshd-session[227682]: Received disconnect from 192.168.122.10 port 45346:11: disconnected by user
Sep 30 07:53:53 compute-0 sshd-session[227682]: Disconnected from user zuul 192.168.122.10 port 45346
Sep 30 07:53:53 compute-0 sshd-session[227679]: pam_unix(sshd:session): session closed for user zuul
Sep 30 07:53:53 compute-0 systemd[1]: session-35.scope: Deactivated successfully.
Sep 30 07:53:53 compute-0 systemd[1]: session-35.scope: Consumed 1min 12.228s CPU time, 528.0M memory peak, read 124.4M from disk, written 18.0M to disk.
Sep 30 07:53:53 compute-0 systemd-logind[824]: Session 35 logged out. Waiting for processes to exit.
Sep 30 07:53:53 compute-0 systemd-logind[824]: Removed session 35.
Sep 30 07:53:53 compute-0 podman[230966]: 2025-09-30 07:53:53.802776104 +0000 UTC m=+0.085349373 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, distribution-scope=public, name=ubi9-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, vendor=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter)
Sep 30 07:53:53 compute-0 sshd-session[230967]: Accepted publickey for zuul from 192.168.122.10 port 55124 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 07:53:53 compute-0 systemd-logind[824]: New session 36 of user zuul.
Sep 30 07:53:53 compute-0 systemd[1]: Started Session 36 of User zuul.
Sep 30 07:53:53 compute-0 sshd-session[230967]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 07:53:53 compute-0 sudo[230992]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-0-2025-09-30-wbblffk.tar.xz
Sep 30 07:53:53 compute-0 sudo[230992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:53:53 compute-0 sudo[230992]: pam_unix(sudo:session): session closed for user root
Sep 30 07:53:53 compute-0 sshd-session[230991]: Received disconnect from 192.168.122.10 port 55124:11: disconnected by user
Sep 30 07:53:53 compute-0 sshd-session[230991]: Disconnected from user zuul 192.168.122.10 port 55124
Sep 30 07:53:54 compute-0 sshd-session[230967]: pam_unix(sshd:session): session closed for user zuul
Sep 30 07:53:54 compute-0 systemd[1]: session-36.scope: Deactivated successfully.
Sep 30 07:53:54 compute-0 systemd-logind[824]: Session 36 logged out. Waiting for processes to exit.
Sep 30 07:53:54 compute-0 systemd-logind[824]: Removed session 36.
Sep 30 07:53:54 compute-0 sshd-session[231017]: Accepted publickey for zuul from 192.168.122.10 port 55140 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 07:53:54 compute-0 systemd-logind[824]: New session 37 of user zuul.
Sep 30 07:53:54 compute-0 systemd[1]: Started Session 37 of User zuul.
Sep 30 07:53:54 compute-0 sshd-session[231017]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 07:53:54 compute-0 nova_compute[189265]: 2025-09-30 07:53:54.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:54 compute-0 sudo[231021]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Sep 30 07:53:54 compute-0 sudo[231021]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 07:53:54 compute-0 sudo[231021]: pam_unix(sudo:session): session closed for user root
Sep 30 07:53:54 compute-0 sshd-session[231020]: Received disconnect from 192.168.122.10 port 55140:11: disconnected by user
Sep 30 07:53:54 compute-0 sshd-session[231020]: Disconnected from user zuul 192.168.122.10 port 55140
Sep 30 07:53:54 compute-0 sshd-session[231017]: pam_unix(sshd:session): session closed for user zuul
Sep 30 07:53:54 compute-0 systemd[1]: session-37.scope: Deactivated successfully.
Sep 30 07:53:54 compute-0 systemd-logind[824]: Session 37 logged out. Waiting for processes to exit.
Sep 30 07:53:54 compute-0 systemd-logind[824]: Removed session 37.
Sep 30 07:53:55 compute-0 nova_compute[189265]: 2025-09-30 07:53:55.421 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:57 compute-0 podman[231046]: 2025-09-30 07:53:57.476462347 +0000 UTC m=+0.055868383 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Sep 30 07:53:57 compute-0 podman[231047]: 2025-09-30 07:53:57.476515988 +0000 UTC m=+0.056679606 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Sep 30 07:53:57 compute-0 podman[231048]: 2025-09-30 07:53:57.54140303 +0000 UTC m=+0.120866578 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 07:53:59 compute-0 nova_compute[189265]: 2025-09-30 07:53:59.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:53:59 compute-0 podman[199733]: time="2025-09-30T07:53:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:53:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:53:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:53:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:53:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3019 "" "Go-http-client/1.1"
Sep 30 07:54:00 compute-0 nova_compute[189265]: 2025-09-30 07:54:00.420 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:01 compute-0 openstack_network_exporter[201859]: ERROR   07:54:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:54:01 compute-0 openstack_network_exporter[201859]: ERROR   07:54:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:54:01 compute-0 openstack_network_exporter[201859]: ERROR   07:54:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:54:01 compute-0 openstack_network_exporter[201859]: ERROR   07:54:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:54:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:54:01 compute-0 openstack_network_exporter[201859]: ERROR   07:54:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:54:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:54:03 compute-0 nova_compute[189265]: 2025-09-30 07:54:03.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:54:03 compute-0 nova_compute[189265]: 2025-09-30 07:54:03.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:54:03 compute-0 nova_compute[189265]: 2025-09-30 07:54:03.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:54:04 compute-0 nova_compute[189265]: 2025-09-30 07:54:04.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:05 compute-0 nova_compute[189265]: 2025-09-30 07:54:05.423 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:05 compute-0 systemd[1]: systemd-timedated.service: Deactivated successfully.
Sep 30 07:54:05 compute-0 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Sep 30 07:54:08 compute-0 podman[231112]: 2025-09-30 07:54:08.49110385 +0000 UTC m=+0.069808165 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:54:09 compute-0 nova_compute[189265]: 2025-09-30 07:54:09.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:09 compute-0 nova_compute[189265]: 2025-09-30 07:54:09.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:54:10 compute-0 nova_compute[189265]: 2025-09-30 07:54:10.306 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:54:10 compute-0 nova_compute[189265]: 2025-09-30 07:54:10.307 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:54:10 compute-0 nova_compute[189265]: 2025-09-30 07:54:10.307 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:54:10 compute-0 nova_compute[189265]: 2025-09-30 07:54:10.307 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:54:10 compute-0 nova_compute[189265]: 2025-09-30 07:54:10.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:10 compute-0 nova_compute[189265]: 2025-09-30 07:54:10.521 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:54:10 compute-0 nova_compute[189265]: 2025-09-30 07:54:10.522 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:54:10 compute-0 nova_compute[189265]: 2025-09-30 07:54:10.560 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.039s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:54:10 compute-0 nova_compute[189265]: 2025-09-30 07:54:10.561 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5722MB free_disk=73.29512786865234GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:54:10 compute-0 nova_compute[189265]: 2025-09-30 07:54:10.562 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:54:10 compute-0 nova_compute[189265]: 2025-09-30 07:54:10.562 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:54:11 compute-0 nova_compute[189265]: 2025-09-30 07:54:11.628 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:54:11 compute-0 nova_compute[189265]: 2025-09-30 07:54:11.629 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:54:10 up  1:51,  0 user,  load average: 0.84, 0.44, 0.33\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:54:11 compute-0 nova_compute[189265]: 2025-09-30 07:54:11.668 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:54:12 compute-0 nova_compute[189265]: 2025-09-30 07:54:12.178 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:54:12 compute-0 nova_compute[189265]: 2025-09-30 07:54:12.691 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:54:12 compute-0 nova_compute[189265]: 2025-09-30 07:54:12.691 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.130s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:54:14 compute-0 nova_compute[189265]: 2025-09-30 07:54:14.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:15 compute-0 nova_compute[189265]: 2025-09-30 07:54:15.425 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:15 compute-0 nova_compute[189265]: 2025-09-30 07:54:15.691 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:54:15 compute-0 nova_compute[189265]: 2025-09-30 07:54:15.692 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:54:15 compute-0 nova_compute[189265]: 2025-09-30 07:54:15.692 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:54:15 compute-0 nova_compute[189265]: 2025-09-30 07:54:15.693 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:54:19 compute-0 nova_compute[189265]: 2025-09-30 07:54:19.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:19 compute-0 podman[231138]: 2025-09-30 07:54:19.484314084 +0000 UTC m=+0.074885891 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=iscsid, container_name=iscsid, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 07:54:20 compute-0 nova_compute[189265]: 2025-09-30 07:54:20.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:54:20.606 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:54:20.606 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:54:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:54:20.606 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:54:21 compute-0 nova_compute[189265]: 2025-09-30 07:54:21.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:54:23 compute-0 sshd-session[231159]: Invalid user daniil from 159.89.22.242 port 37590
Sep 30 07:54:23 compute-0 sshd-session[231159]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:54:23 compute-0 sshd-session[231159]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=159.89.22.242
Sep 30 07:54:23 compute-0 nova_compute[189265]: 2025-09-30 07:54:23.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:54:24 compute-0 nova_compute[189265]: 2025-09-30 07:54:24.229 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:24 compute-0 podman[231161]: 2025-09-30 07:54:24.522107848 +0000 UTC m=+0.097495693 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Sep 30 07:54:25 compute-0 nova_compute[189265]: 2025-09-30 07:54:25.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:25 compute-0 sshd-session[231159]: Failed password for invalid user daniil from 159.89.22.242 port 37590 ssh2
Sep 30 07:54:26 compute-0 sshd-session[231159]: Received disconnect from 159.89.22.242 port 37590:11: Bye Bye [preauth]
Sep 30 07:54:26 compute-0 sshd-session[231159]: Disconnected from invalid user daniil 159.89.22.242 port 37590 [preauth]
Sep 30 07:54:27 compute-0 nova_compute[189265]: 2025-09-30 07:54:27.296 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:54:27 compute-0 nova_compute[189265]: 2025-09-30 07:54:27.297 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 07:54:28 compute-0 podman[231183]: 2025-09-30 07:54:28.521764222 +0000 UTC m=+0.088662538 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Sep 30 07:54:28 compute-0 podman[231182]: 2025-09-30 07:54:28.533258034 +0000 UTC m=+0.106472832 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 07:54:28 compute-0 podman[231184]: 2025-09-30 07:54:28.546450254 +0000 UTC m=+0.116260664 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.4, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Sep 30 07:54:29 compute-0 nova_compute[189265]: 2025-09-30 07:54:29.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:29 compute-0 podman[199733]: time="2025-09-30T07:54:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:54:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:54:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:54:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:54:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3015 "" "Go-http-client/1.1"
Sep 30 07:54:30 compute-0 nova_compute[189265]: 2025-09-30 07:54:30.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:31 compute-0 nova_compute[189265]: 2025-09-30 07:54:31.290 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:54:31 compute-0 openstack_network_exporter[201859]: ERROR   07:54:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:54:31 compute-0 openstack_network_exporter[201859]: ERROR   07:54:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:54:31 compute-0 openstack_network_exporter[201859]: ERROR   07:54:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:54:31 compute-0 openstack_network_exporter[201859]: ERROR   07:54:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:54:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:54:31 compute-0 openstack_network_exporter[201859]: ERROR   07:54:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:54:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:54:34 compute-0 nova_compute[189265]: 2025-09-30 07:54:34.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:35 compute-0 nova_compute[189265]: 2025-09-30 07:54:35.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:35 compute-0 sshd-session[231248]: Invalid user test from 103.57.223.153 port 35484
Sep 30 07:54:35 compute-0 sshd-session[231248]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:54:35 compute-0 sshd-session[231248]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.57.223.153
Sep 30 07:54:37 compute-0 sshd-session[231248]: Failed password for invalid user test from 103.57.223.153 port 35484 ssh2
Sep 30 07:54:38 compute-0 sshd-session[231248]: Received disconnect from 103.57.223.153 port 35484:11: Bye Bye [preauth]
Sep 30 07:54:38 compute-0 sshd-session[231248]: Disconnected from invalid user test 103.57.223.153 port 35484 [preauth]
Sep 30 07:54:39 compute-0 nova_compute[189265]: 2025-09-30 07:54:39.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:39 compute-0 podman[231250]: 2025-09-30 07:54:39.5310292 +0000 UTC m=+0.099042878 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:54:39 compute-0 nova_compute[189265]: 2025-09-30 07:54:39.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:54:39 compute-0 nova_compute[189265]: 2025-09-30 07:54:39.788 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 07:54:40 compute-0 nova_compute[189265]: 2025-09-30 07:54:40.298 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 07:54:40 compute-0 nova_compute[189265]: 2025-09-30 07:54:40.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:44 compute-0 nova_compute[189265]: 2025-09-30 07:54:44.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:45 compute-0 nova_compute[189265]: 2025-09-30 07:54:45.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:48 compute-0 sshd-session[231275]: Invalid user server from 152.32.144.167 port 58260
Sep 30 07:54:48 compute-0 sshd-session[231275]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:54:48 compute-0 sshd-session[231275]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=152.32.144.167
Sep 30 07:54:49 compute-0 nova_compute[189265]: 2025-09-30 07:54:49.238 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:50 compute-0 nova_compute[189265]: 2025-09-30 07:54:50.439 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:50 compute-0 podman[231277]: 2025-09-30 07:54:50.511818267 +0000 UTC m=+0.086283040 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0)
Sep 30 07:54:50 compute-0 sshd-session[231275]: Failed password for invalid user server from 152.32.144.167 port 58260 ssh2
Sep 30 07:54:51 compute-0 sshd-session[231275]: Received disconnect from 152.32.144.167 port 58260:11: Bye Bye [preauth]
Sep 30 07:54:51 compute-0 sshd-session[231275]: Disconnected from invalid user server 152.32.144.167 port 58260 [preauth]
Sep 30 07:54:54 compute-0 nova_compute[189265]: 2025-09-30 07:54:54.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:55 compute-0 nova_compute[189265]: 2025-09-30 07:54:55.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:55 compute-0 podman[231299]: 2025-09-30 07:54:55.53288002 +0000 UTC m=+0.115693748 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, 
release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, architecture=x86_64)
Sep 30 07:54:59 compute-0 nova_compute[189265]: 2025-09-30 07:54:59.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:54:59 compute-0 podman[231321]: 2025-09-30 07:54:59.367102872 +0000 UTC m=+0.085037243 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:54:59 compute-0 podman[231322]: 2025-09-30 07:54:59.407693623 +0000 UTC m=+0.115463371 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
org.label-schema.build-date=20250930, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:54:59 compute-0 podman[231323]: 2025-09-30 07:54:59.460679121 +0000 UTC m=+0.169030506 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 07:54:59 compute-0 podman[199733]: time="2025-09-30T07:54:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:54:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:54:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:54:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:54:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3016 "" "Go-http-client/1.1"
Sep 30 07:55:00 compute-0 nova_compute[189265]: 2025-09-30 07:55:00.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:01 compute-0 openstack_network_exporter[201859]: ERROR   07:55:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:55:01 compute-0 openstack_network_exporter[201859]: ERROR   07:55:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:55:01 compute-0 openstack_network_exporter[201859]: ERROR   07:55:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:55:01 compute-0 openstack_network_exporter[201859]: ERROR   07:55:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:55:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:55:01 compute-0 openstack_network_exporter[201859]: ERROR   07:55:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:55:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:55:04 compute-0 nova_compute[189265]: 2025-09-30 07:55:04.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:04 compute-0 nova_compute[189265]: 2025-09-30 07:55:04.293 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:55:04 compute-0 nova_compute[189265]: 2025-09-30 07:55:04.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:55:05 compute-0 nova_compute[189265]: 2025-09-30 07:55:05.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:05 compute-0 nova_compute[189265]: 2025-09-30 07:55:05.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:55:07 compute-0 nova_compute[189265]: 2025-09-30 07:55:07.133 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:55:09 compute-0 nova_compute[189265]: 2025-09-30 07:55:09.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:10 compute-0 nova_compute[189265]: 2025-09-30 07:55:10.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:10 compute-0 podman[231383]: 2025-09-30 07:55:10.529918419 +0000 UTC m=+0.081041638 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:55:11 compute-0 nova_compute[189265]: 2025-09-30 07:55:11.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:55:12 compute-0 nova_compute[189265]: 2025-09-30 07:55:12.309 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:55:12 compute-0 nova_compute[189265]: 2025-09-30 07:55:12.310 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:55:12 compute-0 nova_compute[189265]: 2025-09-30 07:55:12.310 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:55:12 compute-0 nova_compute[189265]: 2025-09-30 07:55:12.310 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:55:12 compute-0 nova_compute[189265]: 2025-09-30 07:55:12.545 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:55:12 compute-0 nova_compute[189265]: 2025-09-30 07:55:12.547 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:55:12 compute-0 nova_compute[189265]: 2025-09-30 07:55:12.578 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.031s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:55:12 compute-0 nova_compute[189265]: 2025-09-30 07:55:12.579 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5827MB free_disk=73.29514694213867GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:55:12 compute-0 nova_compute[189265]: 2025-09-30 07:55:12.580 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:55:12 compute-0 nova_compute[189265]: 2025-09-30 07:55:12.580 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:55:13 compute-0 nova_compute[189265]: 2025-09-30 07:55:13.774 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:55:13 compute-0 nova_compute[189265]: 2025-09-30 07:55:13.775 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:55:12 up  1:52,  0 user,  load average: 0.30, 0.35, 0.30\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:55:13 compute-0 nova_compute[189265]: 2025-09-30 07:55:13.852 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing inventories for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:822
Sep 30 07:55:13 compute-0 nova_compute[189265]: 2025-09-30 07:55:13.924 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating ProviderTree inventory for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:786
Sep 30 07:55:13 compute-0 nova_compute[189265]: 2025-09-30 07:55:13.925 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Updating inventory in ProviderTree for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:176
Sep 30 07:55:13 compute-0 nova_compute[189265]: 2025-09-30 07:55:13.942 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing aggregate associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, aggregates: None _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:831
Sep 30 07:55:14 compute-0 nova_compute[189265]: 2025-09-30 07:55:14.036 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Refreshing trait associations for resource provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc, traits: COMPUTE_SECURITY_TPM_CRB,HW_ARCH_X86_64,HW_CPU_X86_F16C,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_AVX2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_ADDRESS_SPACE_EMULATED,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_ADDRESS_SPACE_PASSTHROUGH,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,HW_CPU_X86_AESNI,COMPUTE_STORAGE_VIRTIO_FS,HW_CPU_X86_FMA3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SSE,COMPUTE_ACCELERATORS,HW_CPU_X86_SVM,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VOLUME_EXTEND,COMPUTE_ARCH_X86_64,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SHA,HW_CPU_X86_BMI,COMPUTE_SOUND_MODEL_USB,COMPUTE_SOUND_MODEL_SB16,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_STORAGE_BUS_USB,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_SECURITY_STATELESS_FIRMWARE,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,HW_CPU_X86_BMI2,COMPUTE_SOUND_MODEL_ICH9,COMPUTE_SOUND_MODEL_ES1370,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_SOUND_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_TIS,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VGA,HW_CPU_X86_AVX,COMPUTE_SOUND_MODEL_AC97,COMPUTE_DEVICE_TAGGING,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_ABM,COMPUTE_NET_VIRTIO_PACKED,COMPUTE_NET_VIF_MODEL_IGB,HW_CPU_X86_SSE41,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_SOUND_MODEL_ICH6,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_SOUND_MODEL_PCSPK,HW_CPU_X86_CLMUL _refresh_associations /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:843
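The trait list in the record above is what this node advertises to the Placement service. A quick way to survey such a list is to group traits by their leading namespace token; the sketch below is illustrative only, and `TRAITS` is a hand-picked subset of the traits from the log line, not the full set:

```python
from collections import Counter

# Subset of the traits reported in the log record above (hand-picked
# for brevity; the real record carries ~80 traits).
TRAITS = [
    "COMPUTE_SECURITY_TPM_CRB", "HW_ARCH_X86_64", "HW_CPU_X86_F16C",
    "COMPUTE_IMAGE_TYPE_QCOW2", "HW_CPU_X86_AVX2", "COMPUTE_NODE",
]

def by_namespace(traits):
    """Count traits by their leading namespace token (COMPUTE, HW, ...)."""
    return Counter(t.split("_", 1)[0] for t in traits)

print(by_namespace(TRAITS))
```

Run against the full list from the record, the same grouping separates hardware capability traits (`HW_*`) from virt-driver capability traits (`COMPUTE_*`).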
Sep 30 07:55:14 compute-0 nova_compute[189265]: 2025-09-30 07:55:14.204 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:55:14 compute-0 nova_compute[189265]: 2025-09-30 07:55:14.246 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:14 compute-0 unix_chkpwd[231411]: password check failed for user (root)
Sep 30 07:55:14 compute-0 sshd-session[231408]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=159.89.22.242  user=root
Sep 30 07:55:14 compute-0 nova_compute[189265]: 2025-09-30 07:55:14.710 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
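The inventory dict logged above determines how much of each resource class the scheduler may place on this node: Placement computes capacity as `(total - reserved) * allocation_ratio`. A minimal sketch recomputing that from the logged values (illustrative only, not nova's actual code):

```python
# Inventory copied from the log record above (step_size/min/max omitted).
inventory = {
    'VCPU': {'total': 8, 'reserved': 0, 'allocation_ratio': 4.0},
    'MEMORY_MB': {'total': 7679, 'reserved': 512, 'allocation_ratio': 1.0},
    'DISK_GB': {'total': 79, 'reserved': 1, 'allocation_ratio': 0.9},
}

def schedulable(inv):
    """Placement capacity formula: (total - reserved) * allocation_ratio."""
    return {rc: (v['total'] - v['reserved']) * v['allocation_ratio']
            for rc, v in inv.items()}

# VCPU: (8-0)*4.0 = 32, MEMORY_MB: (7679-512)*1.0 = 7167,
# DISK_GB: (79-1)*0.9 = 70.2 (overcommit on CPU, undercommit on disk).
print(schedulable(inventory))
```

This is why the node with 8 physical vCPUs can accept up to 32 vCPUs of instances, while only ~70 GB of its 79 GB disk is schedulable.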
Sep 30 07:55:15 compute-0 nova_compute[189265]: 2025-09-30 07:55:15.219 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:55:15 compute-0 nova_compute[189265]: 2025-09-30 07:55:15.219 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.638s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
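The `lockutils` lines above report two timings per lock: how long the caller waited to acquire it (0.001s) and how long it was held (2.638s here, the full resource-update pass). A minimal sketch of that wait/held instrumentation as a context manager (illustrative, not oslo.concurrency's implementation):

```python
import threading
import time
from contextlib import contextmanager

@contextmanager
def timed_lock(lock, name):
    """Log wait and hold durations around a lock, lockutils-style."""
    t0 = time.monotonic()
    with lock:
        waited = time.monotonic() - t0
        print(f'Lock "{name}" acquired :: waited {waited:.3f}s')
        t1 = time.monotonic()
        try:
            yield
        finally:
            held = time.monotonic() - t1
            print(f'Lock "{name}" released :: held {held:.3f}s')

lock = threading.Lock()
with timed_lock(lock, "compute_resources"):
    pass  # critical section; the resource tracker update would run here
```

A long "held" time with short "waited" times, as in this log, means the lock itself is not contended; the 2.6s is the work done inside it.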
Sep 30 07:55:15 compute-0 nova_compute[189265]: 2025-09-30 07:55:15.446 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:16 compute-0 nova_compute[189265]: 2025-09-30 07:55:16.219 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:55:16 compute-0 nova_compute[189265]: 2025-09-30 07:55:16.219 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:55:16 compute-0 nova_compute[189265]: 2025-09-30 07:55:16.220 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:55:16 compute-0 nova_compute[189265]: 2025-09-30 07:55:16.220 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:55:16 compute-0 unix_chkpwd[231413]: password check failed for user (root)
Sep 30 07:55:16 compute-0 sshd-session[231410]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=185.156.73.233  user=root
Sep 30 07:55:16 compute-0 sshd-session[231408]: Failed password for root from 159.89.22.242 port 51092 ssh2
Sep 30 07:55:17 compute-0 sshd-session[231410]: Failed password for root from 185.156.73.233 port 35736 ssh2
Sep 30 07:55:18 compute-0 sshd-session[231410]: Connection closed by authenticating user root 185.156.73.233 port 35736 [preauth]
Sep 30 07:55:18 compute-0 sshd-session[231408]: Received disconnect from 159.89.22.242 port 51092:11: Bye Bye [preauth]
Sep 30 07:55:18 compute-0 sshd-session[231408]: Disconnected from authenticating user root 159.89.22.242 port 51092 [preauth]
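The interleaved `sshd-session`/`pam_unix` records above are a routine SSH brute-force pattern (root and guessed usernames from multiple source addresses). A minimal sketch for tallying such attempts per source IP; the sample lines are copied from this log, and the regex only matches the "Failed password" form:

```python
import re
from collections import Counter

SAMPLE = """\
Sep 30 07:55:16 compute-0 sshd-session[231408]: Failed password for root from 159.89.22.242 port 51092 ssh2
Sep 30 07:55:17 compute-0 sshd-session[231410]: Failed password for root from 185.156.73.233 port 35736 ssh2
Sep 30 07:55:44 compute-0 sshd-session[231519]: Failed password for invalid user test1 from 103.57.223.153 port 49366 ssh2
"""

# Captures the attempted user (group 1) and source address (group 2).
FAILED = re.compile(r"Failed password for (?:invalid user )?(\S+) from (\S+) port")

def tally_by_source(lines):
    """Count failed SSH logins per source IP address."""
    return Counter(m.group(2) for line in lines if (m := FAILED.search(line)))

print(tally_by_source(SAMPLE.splitlines()))
```

Feeding it the full journal would show whether any single address exceeds a lockout threshold worth feeding to a blocker such as fail2ban.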
Sep 30 07:55:19 compute-0 nova_compute[189265]: 2025-09-30 07:55:19.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:20 compute-0 nova_compute[189265]: 2025-09-30 07:55:20.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:55:20.608 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:55:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:55:20.608 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:55:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:55:20.608 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:55:21 compute-0 podman[231415]: 2025-09-30 07:55:21.482067138 +0000 UTC m=+0.069246488 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=iscsid)
Sep 30 07:55:22 compute-0 nova_compute[189265]: 2025-09-30 07:55:22.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:55:24 compute-0 nova_compute[189265]: 2025-09-30 07:55:24.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:25 compute-0 nova_compute[189265]: 2025-09-30 07:55:25.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:26 compute-0 podman[231436]: 2025-09-30 07:55:26.542717044 +0000 UTC m=+0.115884344 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, config_id=edpm, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Sep 30 07:55:29 compute-0 nova_compute[189265]: 2025-09-30 07:55:29.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:29 compute-0 podman[231457]: 2025-09-30 07:55:29.488139702 +0000 UTC m=+0.065807549 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true)
Sep 30 07:55:29 compute-0 podman[231477]: 2025-09-30 07:55:29.615537577 +0000 UTC m=+0.077940299 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20250930)
Sep 30 07:55:29 compute-0 podman[231478]: 2025-09-30 07:55:29.650360991 +0000 UTC m=+0.114039700 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest)
Sep 30 07:55:29 compute-0 podman[199733]: time="2025-09-30T07:55:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:55:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:55:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:55:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:55:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Sep 30 07:55:30 compute-0 nova_compute[189265]: 2025-09-30 07:55:30.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:31 compute-0 openstack_network_exporter[201859]: ERROR   07:55:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:55:31 compute-0 openstack_network_exporter[201859]: ERROR   07:55:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:55:31 compute-0 openstack_network_exporter[201859]: ERROR   07:55:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:55:31 compute-0 openstack_network_exporter[201859]: ERROR   07:55:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:55:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:55:31 compute-0 openstack_network_exporter[201859]: ERROR   07:55:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:55:31 compute-0 openstack_network_exporter[201859]: 
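The exporter errors above all stem from missing control socket files: `ovn-northd` and `ovsdb-server` are not running on this node (on a compute-only host they typically live on the control plane), so `ovs-appctl`-style calls have nothing to connect to. A minimal sketch of the same pre-check, listing the `*.ctl` sockets a management call would use (the run directory is the conventional default and an assumption here; adjust per deployment):

```python
import glob
import os

def control_sockets(run_dir="/var/run/openvswitch"):
    """Return the *.ctl control socket files present under run_dir."""
    return sorted(glob.glob(os.path.join(run_dir, "*.ctl")))

# On a node without a local ovsdb-server/ovn-northd this is empty,
# matching the "no control socket files found" errors in the log.
print(control_sockets() or "no control socket files found")
```

An empty result here is expected noise on EDPM compute nodes rather than a fault, which is why these ERROR lines repeat on every scrape.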
Sep 30 07:55:34 compute-0 nova_compute[189265]: 2025-09-30 07:55:34.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:35 compute-0 nova_compute[189265]: 2025-09-30 07:55:35.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:39 compute-0 nova_compute[189265]: 2025-09-30 07:55:39.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:40 compute-0 nova_compute[189265]: 2025-09-30 07:55:40.560 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:41 compute-0 podman[231518]: 2025-09-30 07:55:41.506902205 +0000 UTC m=+0.076636551 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:55:42 compute-0 sshd-session[231519]: Invalid user test1 from 103.57.223.153 port 49366
Sep 30 07:55:42 compute-0 sshd-session[231519]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:55:42 compute-0 sshd-session[231519]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.57.223.153
Sep 30 07:55:44 compute-0 nova_compute[189265]: 2025-09-30 07:55:44.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:44 compute-0 sshd-session[231519]: Failed password for invalid user test1 from 103.57.223.153 port 49366 ssh2
Sep 30 07:55:45 compute-0 sshd-session[231519]: Received disconnect from 103.57.223.153 port 49366:11: Bye Bye [preauth]
Sep 30 07:55:45 compute-0 sshd-session[231519]: Disconnected from invalid user test1 103.57.223.153 port 49366 [preauth]
Sep 30 07:55:45 compute-0 nova_compute[189265]: 2025-09-30 07:55:45.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:49 compute-0 nova_compute[189265]: 2025-09-30 07:55:49.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:50 compute-0 nova_compute[189265]: 2025-09-30 07:55:50.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:50 compute-0 sshd-session[231546]: Invalid user minecraft from 152.32.144.167 port 37388
Sep 30 07:55:50 compute-0 sshd-session[231546]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:55:50 compute-0 sshd-session[231546]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=152.32.144.167
Sep 30 07:55:52 compute-0 podman[231548]: 2025-09-30 07:55:52.50778114 +0000 UTC m=+0.086800634 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.4, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Sep 30 07:55:52 compute-0 sshd-session[231546]: Failed password for invalid user minecraft from 152.32.144.167 port 37388 ssh2
Sep 30 07:55:53 compute-0 sshd-session[231546]: Received disconnect from 152.32.144.167 port 37388:11: Bye Bye [preauth]
Sep 30 07:55:53 compute-0 sshd-session[231546]: Disconnected from invalid user minecraft 152.32.144.167 port 37388 [preauth]
Sep 30 07:55:54 compute-0 nova_compute[189265]: 2025-09-30 07:55:54.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:55 compute-0 nova_compute[189265]: 2025-09-30 07:55:55.670 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:57 compute-0 podman[231566]: 2025-09-30 07:55:57.500075122 +0000 UTC m=+0.082598603 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 07:55:59 compute-0 nova_compute[189265]: 2025-09-30 07:55:59.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:55:59 compute-0 podman[199733]: time="2025-09-30T07:55:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:55:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:55:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:55:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:55:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Sep 30 07:56:00 compute-0 podman[231589]: 2025-09-30 07:56:00.486338119 +0000 UTC m=+0.063160163 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_id=ovn_metadata_agent)
Sep 30 07:56:00 compute-0 podman[231588]: 2025-09-30 07:56:00.500146068 +0000 UTC m=+0.082293975 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20250930, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Sep 30 07:56:00 compute-0 podman[231590]: 2025-09-30 07:56:00.546260008 +0000 UTC m=+0.120489647 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest)
Sep 30 07:56:00 compute-0 nova_compute[189265]: 2025-09-30 07:56:00.671 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:01 compute-0 openstack_network_exporter[201859]: ERROR   07:56:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:56:01 compute-0 openstack_network_exporter[201859]: ERROR   07:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:56:01 compute-0 openstack_network_exporter[201859]: ERROR   07:56:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:56:01 compute-0 openstack_network_exporter[201859]: ERROR   07:56:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:56:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:56:01 compute-0 openstack_network_exporter[201859]: ERROR   07:56:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:56:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:56:04 compute-0 sshd-session[231651]: error: kex_exchange_identification: read: Connection reset by peer
Sep 30 07:56:04 compute-0 sshd-session[231651]: Connection reset by 81.237.205.92 port 34402
Sep 30 07:56:04 compute-0 nova_compute[189265]: 2025-09-30 07:56:04.309 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:05 compute-0 nova_compute[189265]: 2025-09-30 07:56:05.674 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:05 compute-0 nova_compute[189265]: 2025-09-30 07:56:05.784 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:56:06 compute-0 unix_chkpwd[231654]: password check failed for user (root)
Sep 30 07:56:06 compute-0 sshd-session[231652]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=159.89.22.242  user=root
Sep 30 07:56:06 compute-0 nova_compute[189265]: 2025-09-30 07:56:06.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:56:07 compute-0 nova_compute[189265]: 2025-09-30 07:56:07.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:56:08 compute-0 sshd-session[231652]: Failed password for root from 159.89.22.242 port 59688 ssh2
Sep 30 07:56:08 compute-0 sshd-session[231652]: Received disconnect from 159.89.22.242 port 59688:11: Bye Bye [preauth]
Sep 30 07:56:08 compute-0 sshd-session[231652]: Disconnected from authenticating user root 159.89.22.242 port 59688 [preauth]
Sep 30 07:56:09 compute-0 nova_compute[189265]: 2025-09-30 07:56:09.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:10 compute-0 nova_compute[189265]: 2025-09-30 07:56:10.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:12 compute-0 podman[231655]: 2025-09-30 07:56:12.528166167 +0000 UTC m=+0.109718416 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Sep 30 07:56:13 compute-0 nova_compute[189265]: 2025-09-30 07:56:13.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:56:14 compute-0 nova_compute[189265]: 2025-09-30 07:56:14.308 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:56:14 compute-0 nova_compute[189265]: 2025-09-30 07:56:14.309 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:56:14 compute-0 nova_compute[189265]: 2025-09-30 07:56:14.309 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:56:14 compute-0 nova_compute[189265]: 2025-09-30 07:56:14.309 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:56:14 compute-0 nova_compute[189265]: 2025-09-30 07:56:14.313 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:14 compute-0 nova_compute[189265]: 2025-09-30 07:56:14.504 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:56:14 compute-0 nova_compute[189265]: 2025-09-30 07:56:14.505 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:56:14 compute-0 nova_compute[189265]: 2025-09-30 07:56:14.526 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.021s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:56:14 compute-0 nova_compute[189265]: 2025-09-30 07:56:14.527 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5836MB free_disk=73.29341888427734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:56:14 compute-0 nova_compute[189265]: 2025-09-30 07:56:14.527 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:56:14 compute-0 nova_compute[189265]: 2025-09-30 07:56:14.528 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:56:15 compute-0 nova_compute[189265]: 2025-09-30 07:56:15.574 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:56:15 compute-0 nova_compute[189265]: 2025-09-30 07:56:15.574 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:56:14 up  1:53,  0 user,  load average: 0.10, 0.28, 0.28\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:56:15 compute-0 nova_compute[189265]: 2025-09-30 07:56:15.596 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:56:15 compute-0 nova_compute[189265]: 2025-09-30 07:56:15.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:16 compute-0 nova_compute[189265]: 2025-09-30 07:56:16.101 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:56:16 compute-0 nova_compute[189265]: 2025-09-30 07:56:16.610 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:56:16 compute-0 nova_compute[189265]: 2025-09-30 07:56:16.610 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.082s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:56:17 compute-0 nova_compute[189265]: 2025-09-30 07:56:17.610 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:56:17 compute-0 nova_compute[189265]: 2025-09-30 07:56:17.611 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:56:17 compute-0 nova_compute[189265]: 2025-09-30 07:56:17.612 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:56:17 compute-0 nova_compute[189265]: 2025-09-30 07:56:17.612 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:56:19 compute-0 nova_compute[189265]: 2025-09-30 07:56:19.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:56:20.609 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:56:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:56:20.610 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:56:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:56:20.610 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:56:20 compute-0 nova_compute[189265]: 2025-09-30 07:56:20.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:23 compute-0 podman[231681]: 2025-09-30 07:56:23.493437785 +0000 UTC m=+0.067943951 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=iscsid, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS)
Sep 30 07:56:24 compute-0 nova_compute[189265]: 2025-09-30 07:56:24.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:24 compute-0 nova_compute[189265]: 2025-09-30 07:56:24.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:56:25 compute-0 nova_compute[189265]: 2025-09-30 07:56:25.685 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:28 compute-0 podman[231701]: 2025-09-30 07:56:28.506453827 +0000 UTC m=+0.077635120 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, release=1755695350, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, version=9.6, vcs-type=git)
Sep 30 07:56:29 compute-0 nova_compute[189265]: 2025-09-30 07:56:29.348 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:29 compute-0 podman[199733]: time="2025-09-30T07:56:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:56:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:56:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:56:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:56:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3014 "" "Go-http-client/1.1"
Sep 30 07:56:30 compute-0 nova_compute[189265]: 2025-09-30 07:56:30.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:31 compute-0 openstack_network_exporter[201859]: ERROR   07:56:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:56:31 compute-0 openstack_network_exporter[201859]: ERROR   07:56:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:56:31 compute-0 openstack_network_exporter[201859]: ERROR   07:56:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:56:31 compute-0 openstack_network_exporter[201859]: ERROR   07:56:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:56:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:56:31 compute-0 openstack_network_exporter[201859]: ERROR   07:56:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:56:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:56:31 compute-0 podman[231722]: 2025-09-30 07:56:31.520459914 +0000 UTC m=+0.088984918 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:56:31 compute-0 podman[231723]: 2025-09-30 07:56:31.541460869 +0000 UTC m=+0.103335881 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=watcher_latest, tcib_managed=true, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 07:56:31 compute-0 podman[231724]: 2025-09-30 07:56:31.582775491 +0000 UTC m=+0.138424434 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:56:34 compute-0 nova_compute[189265]: 2025-09-30 07:56:34.361 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:35 compute-0 nova_compute[189265]: 2025-09-30 07:56:35.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:35 compute-0 nova_compute[189265]: 2025-09-30 07:56:35.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:56:39 compute-0 nova_compute[189265]: 2025-09-30 07:56:39.363 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:40 compute-0 nova_compute[189265]: 2025-09-30 07:56:40.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:41 compute-0 unix_chkpwd[231789]: password check failed for user (root)
Sep 30 07:56:41 compute-0 sshd-session[231787]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Sep 30 07:56:43 compute-0 podman[231790]: 2025-09-30 07:56:43.501627823 +0000 UTC m=+0.083858720 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:56:43 compute-0 sshd-session[231787]: Failed password for root from 91.224.92.28 port 33652 ssh2
Sep 30 07:56:44 compute-0 unix_chkpwd[231814]: password check failed for user (root)
Sep 30 07:56:44 compute-0 nova_compute[189265]: 2025-09-30 07:56:44.407 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:45 compute-0 nova_compute[189265]: 2025-09-30 07:56:45.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:46 compute-0 sshd-session[231787]: Failed password for root from 91.224.92.28 port 33652 ssh2
Sep 30 07:56:48 compute-0 unix_chkpwd[231815]: password check failed for user (root)
Sep 30 07:56:49 compute-0 nova_compute[189265]: 2025-09-30 07:56:49.456 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:50 compute-0 sshd-session[231787]: Failed password for root from 91.224.92.28 port 33652 ssh2
Sep 30 07:56:50 compute-0 nova_compute[189265]: 2025-09-30 07:56:50.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:51 compute-0 sshd-session[231816]: Invalid user superadmin from 103.57.223.153 port 36922
Sep 30 07:56:51 compute-0 sshd-session[231816]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:56:51 compute-0 sshd-session[231816]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.57.223.153
Sep 30 07:56:52 compute-0 sshd-session[231787]: Received disconnect from 91.224.92.28 port 33652:11:  [preauth]
Sep 30 07:56:52 compute-0 sshd-session[231787]: Disconnected from authenticating user root 91.224.92.28 port 33652 [preauth]
Sep 30 07:56:52 compute-0 sshd-session[231787]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Sep 30 07:56:53 compute-0 unix_chkpwd[231820]: password check failed for user (root)
Sep 30 07:56:53 compute-0 sshd-session[231818]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Sep 30 07:56:53 compute-0 sshd-session[231816]: Failed password for invalid user superadmin from 103.57.223.153 port 36922 ssh2
Sep 30 07:56:53 compute-0 sshd-session[231816]: Received disconnect from 103.57.223.153 port 36922:11: Bye Bye [preauth]
Sep 30 07:56:53 compute-0 sshd-session[231816]: Disconnected from invalid user superadmin 103.57.223.153 port 36922 [preauth]
Sep 30 07:56:54 compute-0 nova_compute[189265]: 2025-09-30 07:56:54.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:54 compute-0 unix_chkpwd[231843]: password check failed for user (root)
Sep 30 07:56:54 compute-0 sshd-session[231821]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=152.32.144.167  user=root
Sep 30 07:56:54 compute-0 podman[231823]: 2025-09-30 07:56:54.551965464 +0000 UTC m=+0.129638269 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=watcher_latest, tcib_managed=true)
Sep 30 07:56:55 compute-0 sshd-session[231818]: Failed password for root from 91.224.92.28 port 33286 ssh2
Sep 30 07:56:55 compute-0 unix_chkpwd[231844]: password check failed for user (root)
Sep 30 07:56:55 compute-0 nova_compute[189265]: 2025-09-30 07:56:55.852 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:56 compute-0 sshd-session[231821]: Failed password for root from 152.32.144.167 port 36676 ssh2
Sep 30 07:56:57 compute-0 sshd-session[231818]: Failed password for root from 91.224.92.28 port 33286 ssh2
Sep 30 07:56:58 compute-0 unix_chkpwd[231847]: password check failed for user (root)
Sep 30 07:56:58 compute-0 sshd-session[231845]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=159.89.22.242  user=root
Sep 30 07:56:58 compute-0 sshd-session[231821]: Received disconnect from 152.32.144.167 port 36676:11: Bye Bye [preauth]
Sep 30 07:56:58 compute-0 sshd-session[231821]: Disconnected from authenticating user root 152.32.144.167 port 36676 [preauth]
Sep 30 07:56:59 compute-0 podman[231848]: 2025-09-30 07:56:59.52460667 +0000 UTC m=+0.106134912 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vcs-type=git, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9)
Sep 30 07:56:59 compute-0 unix_chkpwd[231870]: password check failed for user (root)
Sep 30 07:56:59 compute-0 nova_compute[189265]: 2025-09-30 07:56:59.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:56:59 compute-0 podman[199733]: time="2025-09-30T07:56:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:56:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:56:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:56:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:56:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3015 "" "Go-http-client/1.1"
Sep 30 07:57:00 compute-0 sshd-session[231845]: Failed password for root from 159.89.22.242 port 45208 ssh2
Sep 30 07:57:00 compute-0 sshd-session[231845]: Received disconnect from 159.89.22.242 port 45208:11: Bye Bye [preauth]
Sep 30 07:57:00 compute-0 sshd-session[231845]: Disconnected from authenticating user root 159.89.22.242 port 45208 [preauth]
Sep 30 07:57:00 compute-0 nova_compute[189265]: 2025-09-30 07:57:00.903 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:01 compute-0 openstack_network_exporter[201859]: ERROR   07:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:57:01 compute-0 openstack_network_exporter[201859]: ERROR   07:57:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:57:01 compute-0 openstack_network_exporter[201859]: ERROR   07:57:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:57:01 compute-0 openstack_network_exporter[201859]: ERROR   07:57:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:57:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:57:01 compute-0 openstack_network_exporter[201859]: ERROR   07:57:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:57:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:57:01 compute-0 sshd-session[231818]: Failed password for root from 91.224.92.28 port 33286 ssh2
Sep 30 07:57:02 compute-0 podman[231872]: 2025-09-30 07:57:02.518848237 +0000 UTC m=+0.087208997 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, 
container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Sep 30 07:57:02 compute-0 podman[231871]: 2025-09-30 07:57:02.518914879 +0000 UTC m=+0.092363726 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4)
Sep 30 07:57:02 compute-0 podman[231873]: 2025-09-30 07:57:02.635424139 +0000 UTC m=+0.195759848 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=ovn_controller, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 07:57:03 compute-0 sshd-session[231818]: Received disconnect from 91.224.92.28 port 33286:11:  [preauth]
Sep 30 07:57:03 compute-0 sshd-session[231818]: Disconnected from authenticating user root 91.224.92.28 port 33286 [preauth]
Sep 30 07:57:03 compute-0 sshd-session[231818]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Sep 30 07:57:04 compute-0 unix_chkpwd[231934]: password check failed for user (root)
Sep 30 07:57:04 compute-0 sshd-session[231932]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Sep 30 07:57:04 compute-0 nova_compute[189265]: 2025-09-30 07:57:04.547 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:05 compute-0 nova_compute[189265]: 2025-09-30 07:57:05.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:06 compute-0 sshd-session[231932]: Failed password for root from 91.224.92.28 port 33604 ssh2
Sep 30 07:57:06 compute-0 unix_chkpwd[231935]: password check failed for user (root)
Sep 30 07:57:06 compute-0 nova_compute[189265]: 2025-09-30 07:57:06.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:57:07 compute-0 nova_compute[189265]: 2025-09-30 07:57:07.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:57:07 compute-0 nova_compute[189265]: 2025-09-30 07:57:07.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:57:08 compute-0 sshd-session[231932]: Failed password for root from 91.224.92.28 port 33604 ssh2
Sep 30 07:57:09 compute-0 nova_compute[189265]: 2025-09-30 07:57:09.550 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:10 compute-0 unix_chkpwd[231936]: password check failed for user (root)
Sep 30 07:57:10 compute-0 nova_compute[189265]: 2025-09-30 07:57:10.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:12 compute-0 sshd-session[231932]: Failed password for root from 91.224.92.28 port 33604 ssh2
Sep 30 07:57:14 compute-0 podman[231937]: 2025-09-30 07:57:14.49288226 +0000 UTC m=+0.074522801 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:57:14 compute-0 nova_compute[189265]: 2025-09-30 07:57:14.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:14 compute-0 nova_compute[189265]: 2025-09-30 07:57:14.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:57:14 compute-0 sshd-session[231932]: Received disconnect from 91.224.92.28 port 33604:11:  [preauth]
Sep 30 07:57:14 compute-0 sshd-session[231932]: Disconnected from authenticating user root 91.224.92.28 port 33604 [preauth]
Sep 30 07:57:14 compute-0 sshd-session[231932]: PAM 2 more authentication failures; logname= uid=0 euid=0 tty=ssh ruser= rhost=91.224.92.28  user=root
Sep 30 07:57:15 compute-0 nova_compute[189265]: 2025-09-30 07:57:15.306 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:57:15 compute-0 nova_compute[189265]: 2025-09-30 07:57:15.307 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:57:15 compute-0 nova_compute[189265]: 2025-09-30 07:57:15.307 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:57:15 compute-0 nova_compute[189265]: 2025-09-30 07:57:15.307 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:57:15 compute-0 nova_compute[189265]: 2025-09-30 07:57:15.513 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:57:15 compute-0 nova_compute[189265]: 2025-09-30 07:57:15.514 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:57:15 compute-0 nova_compute[189265]: 2025-09-30 07:57:15.544 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.030s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:57:15 compute-0 nova_compute[189265]: 2025-09-30 07:57:15.545 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5830MB free_disk=73.29341888427734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:57:15 compute-0 nova_compute[189265]: 2025-09-30 07:57:15.545 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:57:15 compute-0 nova_compute[189265]: 2025-09-30 07:57:15.545 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:57:15 compute-0 nova_compute[189265]: 2025-09-30 07:57:15.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:16 compute-0 nova_compute[189265]: 2025-09-30 07:57:16.621 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:57:16 compute-0 nova_compute[189265]: 2025-09-30 07:57:16.621 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:57:15 up  1:54,  0 user,  load average: 0.04, 0.23, 0.26\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:57:16 compute-0 nova_compute[189265]: 2025-09-30 07:57:16.654 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:57:17 compute-0 nova_compute[189265]: 2025-09-30 07:57:17.202 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:57:17 compute-0 nova_compute[189265]: 2025-09-30 07:57:17.713 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:57:17 compute-0 nova_compute[189265]: 2025-09-30 07:57:17.714 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.168s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:57:18 compute-0 nova_compute[189265]: 2025-09-30 07:57:18.714 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:57:18 compute-0 nova_compute[189265]: 2025-09-30 07:57:18.715 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:57:18 compute-0 nova_compute[189265]: 2025-09-30 07:57:18.715 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:57:18 compute-0 nova_compute[189265]: 2025-09-30 07:57:18.715 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:57:19 compute-0 nova_compute[189265]: 2025-09-30 07:57:19.553 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:57:20.611 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:57:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:57:20.611 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:57:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:57:20.611 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:57:20 compute-0 nova_compute[189265]: 2025-09-30 07:57:20.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:24 compute-0 nova_compute[189265]: 2025-09-30 07:57:24.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:25 compute-0 podman[231963]: 2025-09-30 07:57:25.500328485 +0000 UTC m=+0.082069988 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible)
Sep 30 07:57:25 compute-0 nova_compute[189265]: 2025-09-30 07:57:25.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:57:25 compute-0 nova_compute[189265]: 2025-09-30 07:57:25.952 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:29 compute-0 nova_compute[189265]: 2025-09-30 07:57:29.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:29 compute-0 podman[199733]: time="2025-09-30T07:57:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:57:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:57:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:57:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:57:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Sep 30 07:57:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:57:29.848 100322 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '1a:26:7c', 'max_tunid': '16711680', 'northd_internal_version': '24.09.4-20.37.0-77.8', 'svc_monitor_mac': '2e:60:fa:91:d0:34'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/event.py:55
Sep 30 07:57:29 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:57:29.849 100322 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.12/site-packages/neutron/agent/ovn/metadata/agent.py:367
Sep 30 07:57:29 compute-0 nova_compute[189265]: 2025-09-30 07:57:29.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:30 compute-0 podman[231984]: 2025-09-30 07:57:30.496687955 +0000 UTC m=+0.075749056 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., vcs-type=git, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Sep 30 07:57:30 compute-0 nova_compute[189265]: 2025-09-30 07:57:30.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:31 compute-0 openstack_network_exporter[201859]: ERROR   07:57:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:57:31 compute-0 openstack_network_exporter[201859]: ERROR   07:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:57:31 compute-0 openstack_network_exporter[201859]: ERROR   07:57:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:57:31 compute-0 openstack_network_exporter[201859]: ERROR   07:57:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:57:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:57:31 compute-0 openstack_network_exporter[201859]: ERROR   07:57:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:57:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:57:33 compute-0 podman[232007]: 2025-09-30 07:57:33.518285271 +0000 UTC m=+0.086218078 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2)
Sep 30 07:57:33 compute-0 podman[232006]: 2025-09-30 07:57:33.546757462 +0000 UTC m=+0.120197988 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest)
Sep 30 07:57:33 compute-0 podman[232008]: 2025-09-30 07:57:33.570648211 +0000 UTC m=+0.133996696 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, tcib_build_tag=watcher_latest, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Sep 30 07:57:34 compute-0 nova_compute[189265]: 2025-09-30 07:57:34.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:36 compute-0 nova_compute[189265]: 2025-09-30 07:57:36.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:39 compute-0 nova_compute[189265]: 2025-09-30 07:57:39.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:39 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:57:39.850 100322 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=01429670-4ea1-4dab-babc-4bc628cc01bb, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.12/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Sep 30 07:57:41 compute-0 nova_compute[189265]: 2025-09-30 07:57:41.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:44 compute-0 nova_compute[189265]: 2025-09-30 07:57:44.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:45 compute-0 podman[232070]: 2025-09-30 07:57:45.490661374 +0000 UTC m=+0.073451150 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:57:46 compute-0 nova_compute[189265]: 2025-09-30 07:57:46.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:49 compute-0 nova_compute[189265]: 2025-09-30 07:57:49.699 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:51 compute-0 nova_compute[189265]: 2025-09-30 07:57:51.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:52 compute-0 unix_chkpwd[232096]: password check failed for user (root)
Sep 30 07:57:52 compute-0 sshd-session[232094]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=159.89.22.242  user=root
Sep 30 07:57:53 compute-0 sshd-session[232094]: Failed password for root from 159.89.22.242 port 58100 ssh2
Sep 30 07:57:54 compute-0 sshd-session[232094]: Received disconnect from 159.89.22.242 port 58100:11: Bye Bye [preauth]
Sep 30 07:57:54 compute-0 sshd-session[232094]: Disconnected from authenticating user root 159.89.22.242 port 58100 [preauth]
Sep 30 07:57:54 compute-0 nova_compute[189265]: 2025-09-30 07:57:54.727 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:56 compute-0 nova_compute[189265]: 2025-09-30 07:57:56.132 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:56 compute-0 podman[232097]: 2025-09-30 07:57:56.500315084 +0000 UTC m=+0.079375391 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 07:57:59 compute-0 sshd-session[232117]: Invalid user minecraft from 103.57.223.153 port 58630
Sep 30 07:57:59 compute-0 sshd-session[232117]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:57:59 compute-0 sshd-session[232117]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.57.223.153
Sep 30 07:57:59 compute-0 nova_compute[189265]: 2025-09-30 07:57:59.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:57:59 compute-0 podman[199733]: time="2025-09-30T07:57:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:57:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:57:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:57:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:57:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3017 "" "Go-http-client/1.1"
Sep 30 07:58:01 compute-0 nova_compute[189265]: 2025-09-30 07:58:01.136 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:01 compute-0 openstack_network_exporter[201859]: ERROR   07:58:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:58:01 compute-0 openstack_network_exporter[201859]: ERROR   07:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:58:01 compute-0 openstack_network_exporter[201859]: ERROR   07:58:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:58:01 compute-0 openstack_network_exporter[201859]: ERROR   07:58:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:58:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:58:01 compute-0 openstack_network_exporter[201859]: ERROR   07:58:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:58:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:58:01 compute-0 podman[232119]: 2025-09-30 07:58:01.508471074 +0000 UTC m=+0.081710158 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, release=1755695350, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Sep 30 07:58:01 compute-0 sshd-session[232117]: Failed password for invalid user minecraft from 103.57.223.153 port 58630 ssh2
Sep 30 07:58:01 compute-0 sshd-session[232117]: Received disconnect from 103.57.223.153 port 58630:11: Bye Bye [preauth]
Sep 30 07:58:01 compute-0 sshd-session[232117]: Disconnected from invalid user minecraft 103.57.223.153 port 58630 [preauth]
Sep 30 07:58:04 compute-0 podman[232144]: 2025-09-30 07:58:04.513240744 +0000 UTC m=+0.086416443 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=watcher_latest, container_name=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 07:58:04 compute-0 podman[232143]: 2025-09-30 07:58:04.535334282 +0000 UTC m=+0.115890394 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Sep 30 07:58:04 compute-0 podman[232145]: 2025-09-30 07:58:04.563817863 +0000 UTC m=+0.132593065 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Sep 30 07:58:04 compute-0 nova_compute[189265]: 2025-09-30 07:58:04.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:06 compute-0 nova_compute[189265]: 2025-09-30 07:58:06.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:07 compute-0 sshd-session[232141]: Invalid user teszt from 152.32.144.167 port 34962
Sep 30 07:58:07 compute-0 sshd-session[232141]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:58:07 compute-0 sshd-session[232141]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=152.32.144.167
Sep 30 07:58:07 compute-0 nova_compute[189265]: 2025-09-30 07:58:07.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:58:08 compute-0 nova_compute[189265]: 2025-09-30 07:58:08.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:58:09 compute-0 sshd-session[232141]: Failed password for invalid user teszt from 152.32.144.167 port 34962 ssh2
Sep 30 07:58:09 compute-0 nova_compute[189265]: 2025-09-30 07:58:09.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:09 compute-0 nova_compute[189265]: 2025-09-30 07:58:09.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:58:09 compute-0 sshd-session[232141]: Received disconnect from 152.32.144.167 port 34962:11: Bye Bye [preauth]
Sep 30 07:58:09 compute-0 sshd-session[232141]: Disconnected from invalid user teszt 152.32.144.167 port 34962 [preauth]
Sep 30 07:58:11 compute-0 nova_compute[189265]: 2025-09-30 07:58:11.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:14 compute-0 nova_compute[189265]: 2025-09-30 07:58:14.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:58:14 compute-0 nova_compute[189265]: 2025-09-30 07:58:14.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:15 compute-0 nova_compute[189265]: 2025-09-30 07:58:15.321 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:58:15 compute-0 nova_compute[189265]: 2025-09-30 07:58:15.322 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:58:15 compute-0 nova_compute[189265]: 2025-09-30 07:58:15.322 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:58:15 compute-0 nova_compute[189265]: 2025-09-30 07:58:15.322 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:58:15 compute-0 nova_compute[189265]: 2025-09-30 07:58:15.518 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:58:15 compute-0 nova_compute[189265]: 2025-09-30 07:58:15.519 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:58:15 compute-0 nova_compute[189265]: 2025-09-30 07:58:15.553 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:58:15 compute-0 nova_compute[189265]: 2025-09-30 07:58:15.554 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5835MB free_disk=73.2934684753418GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:58:15 compute-0 nova_compute[189265]: 2025-09-30 07:58:15.554 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:58:15 compute-0 nova_compute[189265]: 2025-09-30 07:58:15.554 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:58:16 compute-0 nova_compute[189265]: 2025-09-30 07:58:16.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:16 compute-0 podman[232208]: 2025-09-30 07:58:16.481905229 +0000 UTC m=+0.065863084 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Sep 30 07:58:16 compute-0 nova_compute[189265]: 2025-09-30 07:58:16.598 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:58:16 compute-0 nova_compute[189265]: 2025-09-30 07:58:16.599 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:58:15 up  1:55,  0 user,  load average: 0.01, 0.18, 0.24\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:58:16 compute-0 nova_compute[189265]: 2025-09-30 07:58:16.628 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:58:17 compute-0 nova_compute[189265]: 2025-09-30 07:58:17.137 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:58:17 compute-0 nova_compute[189265]: 2025-09-30 07:58:17.646 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:58:17 compute-0 nova_compute[189265]: 2025-09-30 07:58:17.646 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.092s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:58:19 compute-0 nova_compute[189265]: 2025-09-30 07:58:19.648 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:58:19 compute-0 nova_compute[189265]: 2025-09-30 07:58:19.648 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:58:19 compute-0 nova_compute[189265]: 2025-09-30 07:58:19.649 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:58:19 compute-0 nova_compute[189265]: 2025-09-30 07:58:19.649 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:58:19 compute-0 nova_compute[189265]: 2025-09-30 07:58:19.800 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:58:20.612 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:58:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:58:20.612 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:58:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:58:20.613 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:58:21 compute-0 nova_compute[189265]: 2025-09-30 07:58:21.210 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:24 compute-0 nova_compute[189265]: 2025-09-30 07:58:24.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:26 compute-0 nova_compute[189265]: 2025-09-30 07:58:26.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:27 compute-0 podman[232235]: 2025-09-30 07:58:27.508601511 +0000 UTC m=+0.085768717 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, tcib_managed=true, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, container_name=iscsid, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Sep 30 07:58:27 compute-0 nova_compute[189265]: 2025-09-30 07:58:27.789 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:58:29 compute-0 podman[199733]: time="2025-09-30T07:58:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:58:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:58:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:58:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:58:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3018 "" "Go-http-client/1.1"
Sep 30 07:58:29 compute-0 nova_compute[189265]: 2025-09-30 07:58:29.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:31 compute-0 nova_compute[189265]: 2025-09-30 07:58:31.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:31 compute-0 openstack_network_exporter[201859]: ERROR   07:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:58:31 compute-0 openstack_network_exporter[201859]: ERROR   07:58:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:58:31 compute-0 openstack_network_exporter[201859]: ERROR   07:58:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:58:31 compute-0 openstack_network_exporter[201859]: ERROR   07:58:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:58:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:58:31 compute-0 openstack_network_exporter[201859]: ERROR   07:58:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:58:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:58:32 compute-0 podman[232255]: 2025-09-30 07:58:32.51480805 +0000 UTC m=+0.090049749 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, vcs-type=git, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 07:58:34 compute-0 nova_compute[189265]: 2025-09-30 07:58:34.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:35 compute-0 podman[232273]: 2025-09-30 07:58:35.485932231 +0000 UTC m=+0.069178099 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Sep 30 07:58:35 compute-0 podman[232274]: 2025-09-30 07:58:35.499062458 +0000 UTC m=+0.073023789 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 07:58:35 compute-0 podman[232275]: 2025-09-30 07:58:35.54921548 +0000 UTC m=+0.113728609 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS)
Sep 30 07:58:36 compute-0 nova_compute[189265]: 2025-09-30 07:58:36.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:36 compute-0 nova_compute[189265]: 2025-09-30 07:58:36.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:58:39 compute-0 nova_compute[189265]: 2025-09-30 07:58:39.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:41 compute-0 nova_compute[189265]: 2025-09-30 07:58:41.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:44 compute-0 nova_compute[189265]: 2025-09-30 07:58:44.959 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:45 compute-0 unix_chkpwd[232341]: password check failed for user (root)
Sep 30 07:58:45 compute-0 sshd-session[232339]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=159.89.22.242  user=root
Sep 30 07:58:46 compute-0 nova_compute[189265]: 2025-09-30 07:58:46.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:47 compute-0 podman[232342]: 2025-09-30 07:58:47.504079097 +0000 UTC m=+0.081865903 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Sep 30 07:58:47 compute-0 sshd-session[232339]: Failed password for root from 159.89.22.242 port 44376 ssh2
Sep 30 07:58:49 compute-0 sshd-session[232339]: Received disconnect from 159.89.22.242 port 44376:11: Bye Bye [preauth]
Sep 30 07:58:49 compute-0 sshd-session[232339]: Disconnected from authenticating user root 159.89.22.242 port 44376 [preauth]
Sep 30 07:58:49 compute-0 nova_compute[189265]: 2025-09-30 07:58:49.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:51 compute-0 nova_compute[189265]: 2025-09-30 07:58:51.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:54 compute-0 nova_compute[189265]: 2025-09-30 07:58:54.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:56 compute-0 nova_compute[189265]: 2025-09-30 07:58:56.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:58:58 compute-0 podman[232366]: 2025-09-30 07:58:58.491365626 +0000 UTC m=+0.067115680 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=watcher_latest, container_name=iscsid, io.buildah.version=1.41.4, org.label-schema.build-date=20250930, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 07:58:59 compute-0 podman[199733]: time="2025-09-30T07:58:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:58:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:58:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:58:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:58:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3015 "" "Go-http-client/1.1"
Sep 30 07:58:59 compute-0 nova_compute[189265]: 2025-09-30 07:58:59.998 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:01 compute-0 nova_compute[189265]: 2025-09-30 07:59:01.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:01 compute-0 openstack_network_exporter[201859]: ERROR   07:59:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:59:01 compute-0 openstack_network_exporter[201859]: ERROR   07:59:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:59:01 compute-0 openstack_network_exporter[201859]: ERROR   07:59:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:59:01 compute-0 openstack_network_exporter[201859]: ERROR   07:59:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:59:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:59:01 compute-0 openstack_network_exporter[201859]: ERROR   07:59:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:59:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:59:03 compute-0 podman[232386]: 2025-09-30 07:59:03.502564768 +0000 UTC m=+0.085448076 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41)
Sep 30 07:59:05 compute-0 nova_compute[189265]: 2025-09-30 07:59:05.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:06 compute-0 nova_compute[189265]: 2025-09-30 07:59:06.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:06 compute-0 podman[232408]: 2025-09-30 07:59:06.485466769 +0000 UTC m=+0.069028725 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250930, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest)
Sep 30 07:59:06 compute-0 podman[232407]: 2025-09-30 07:59:06.546752261 +0000 UTC m=+0.127422344 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=watcher_latest, container_name=multipathd, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 10 Base Image)
Sep 30 07:59:06 compute-0 podman[232409]: 2025-09-30 07:59:06.573135689 +0000 UTC m=+0.143335481 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_id=ovn_controller, container_name=ovn_controller)
Sep 30 07:59:08 compute-0 unix_chkpwd[232473]: password check failed for user (root)
Sep 30 07:59:08 compute-0 sshd-session[232471]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.57.223.153  user=root
Sep 30 07:59:09 compute-0 nova_compute[189265]: 2025-09-30 07:59:09.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:59:10 compute-0 nova_compute[189265]: 2025-09-30 07:59:10.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:10 compute-0 nova_compute[189265]: 2025-09-30 07:59:10.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:59:10 compute-0 nova_compute[189265]: 2025-09-30 07:59:10.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:59:11 compute-0 nova_compute[189265]: 2025-09-30 07:59:11.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:11 compute-0 sshd-session[232471]: Failed password for root from 103.57.223.153 port 43950 ssh2
Sep 30 07:59:13 compute-0 sshd-session[232471]: Received disconnect from 103.57.223.153 port 43950:11: Bye Bye [preauth]
Sep 30 07:59:13 compute-0 sshd-session[232471]: Disconnected from authenticating user root 103.57.223.153 port 43950 [preauth]
Sep 30 07:59:14 compute-0 nova_compute[189265]: 2025-09-30 07:59:14.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:59:15 compute-0 nova_compute[189265]: 2025-09-30 07:59:15.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:15 compute-0 nova_compute[189265]: 2025-09-30 07:59:15.311 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:59:15 compute-0 nova_compute[189265]: 2025-09-30 07:59:15.312 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:59:15 compute-0 nova_compute[189265]: 2025-09-30 07:59:15.312 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:59:15 compute-0 nova_compute[189265]: 2025-09-30 07:59:15.312 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Auditing locally available compute resources for compute-0.ctlplane.example.com (node: compute-0.ctlplane.example.com) update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:937
Sep 30 07:59:15 compute-0 nova_compute[189265]: 2025-09-30 07:59:15.531 2 WARNING nova.virt.libvirt.driver [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Sep 30 07:59:15 compute-0 nova_compute[189265]: 2025-09-30 07:59:15.532 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:349
Sep 30 07:59:15 compute-0 nova_compute[189265]: 2025-09-30 07:59:15.560 2 DEBUG oslo_concurrency.processutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CMD "env LANG=C uptime" returned: 0 in 0.027s execute /usr/lib/python3.12/site-packages/oslo_concurrency/processutils.py:372
Sep 30 07:59:15 compute-0 nova_compute[189265]: 2025-09-30 07:59:15.561 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Hypervisor/Node resource view: name=compute-0.ctlplane.example.com free_ram=5839MB free_disk=73.29348754882812GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1136
Sep 30 07:59:15 compute-0 nova_compute[189265]: 2025-09-30 07:59:15.562 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:59:15 compute-0 nova_compute[189265]: 2025-09-30 07:59:15.562 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:59:16 compute-0 nova_compute[189265]: 2025-09-30 07:59:16.384 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:16 compute-0 nova_compute[189265]: 2025-09-30 07:59:16.633 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1159
Sep 30 07:59:16 compute-0 nova_compute[189265]: 2025-09-30 07:59:16.633 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Final resource view: name=compute-0.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] stats={'failed_builds': '0', 'uptime': ' 07:59:15 up  1:56,  0 user,  load average: 0.00, 0.15, 0.22\n'} _report_final_resource_view /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1168
Sep 30 07:59:16 compute-0 nova_compute[189265]: 2025-09-30 07:59:16.660 2 DEBUG nova.compute.provider_tree [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed in ProviderTree for provider: 15ca5e4e-ba83-43d2-ad70-d195a46df5cc update_inventory /usr/lib/python3.12/site-packages/nova/compute/provider_tree.py:180
Sep 30 07:59:17 compute-0 nova_compute[189265]: 2025-09-30 07:59:17.171 2 DEBUG nova.scheduler.client.report [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Inventory has not changed for provider 15ca5e4e-ba83-43d2-ad70-d195a46df5cc based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.12/site-packages/nova/scheduler/client/report.py:958
Sep 30 07:59:17 compute-0 nova_compute[189265]: 2025-09-30 07:59:17.682 2 DEBUG nova.compute.resource_tracker [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Compute_service record updated for compute-0.ctlplane.example.com:compute-0.ctlplane.example.com _update_available_resource /usr/lib/python3.12/site-packages/nova/compute/resource_tracker.py:1097
Sep 30 07:59:17 compute-0 nova_compute[189265]: 2025-09-30 07:59:17.683 2 DEBUG oslo_concurrency.lockutils [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.120s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:59:18 compute-0 unix_chkpwd[232477]: password check failed for user (root)
Sep 30 07:59:18 compute-0 sshd-session[232475]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=152.32.144.167  user=root
Sep 30 07:59:18 compute-0 podman[232478]: 2025-09-30 07:59:18.505448738 +0000 UTC m=+0.079154246 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Sep 30 07:59:20 compute-0 nova_compute[189265]: 2025-09-30 07:59:20.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:20 compute-0 sshd-session[232475]: Failed password for root from 152.32.144.167 port 45020 ssh2
Sep 30 07:59:20 compute-0 sshd-session[232475]: Received disconnect from 152.32.144.167 port 45020:11: Bye Bye [preauth]
Sep 30 07:59:20 compute-0 sshd-session[232475]: Disconnected from authenticating user root 152.32.144.167 port 45020 [preauth]
Sep 30 07:59:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:59:20.613 100322 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:405
Sep 30 07:59:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:59:20.614 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:410
Sep 30 07:59:20 compute-0 ovn_metadata_agent[100317]: 2025-09-30 07:59:20.614 100322 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.12/site-packages/oslo_concurrency/lockutils.py:424
Sep 30 07:59:20 compute-0 nova_compute[189265]: 2025-09-30 07:59:20.683 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:59:20 compute-0 nova_compute[189265]: 2025-09-30 07:59:20.683 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:59:20 compute-0 nova_compute[189265]: 2025-09-30 07:59:20.683 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:59:20 compute-0 nova_compute[189265]: 2025-09-30 07:59:20.684 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11228
Sep 30 07:59:21 compute-0 nova_compute[189265]: 2025-09-30 07:59:21.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:23 compute-0 nova_compute[189265]: 2025-09-30 07:59:23.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:59:25 compute-0 nova_compute[189265]: 2025-09-30 07:59:25.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:26 compute-0 nova_compute[189265]: 2025-09-30 07:59:26.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:29 compute-0 podman[232504]: 2025-09-30 07:59:29.528588419 +0000 UTC m=+0.111521887 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.4, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=watcher_latest, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.build-date=20250930)
Sep 30 07:59:29 compute-0 podman[199733]: time="2025-09-30T07:59:29Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:59:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:59:29 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:59:29 compute-0 podman[199733]: @ - - [30/Sep/2025:07:59:29 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3022 "" "Go-http-client/1.1"
Sep 30 07:59:30 compute-0 nova_compute[189265]: 2025-09-30 07:59:30.036 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:30 compute-0 nova_compute[189265]: 2025-09-30 07:59:30.296 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:59:31 compute-0 openstack_network_exporter[201859]: ERROR   07:59:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:59:31 compute-0 openstack_network_exporter[201859]: ERROR   07:59:31 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 07:59:31 compute-0 openstack_network_exporter[201859]: ERROR   07:59:31 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 07:59:31 compute-0 openstack_network_exporter[201859]: ERROR   07:59:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 07:59:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:59:31 compute-0 openstack_network_exporter[201859]: ERROR   07:59:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 07:59:31 compute-0 openstack_network_exporter[201859]: 
Sep 30 07:59:31 compute-0 nova_compute[189265]: 2025-09-30 07:59:31.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:34 compute-0 podman[232524]: 2025-09-30 07:59:34.524607047 +0000 UTC m=+0.101324013 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter)
Sep 30 07:59:35 compute-0 nova_compute[189265]: 2025-09-30 07:59:35.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:35 compute-0 nova_compute[189265]: 2025-09-30 07:59:35.788 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:59:35 compute-0 nova_compute[189265]: 2025-09-30 07:59:35.789 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.12/site-packages/nova/compute/manager.py:11947
Sep 30 07:59:36 compute-0 nova_compute[189265]: 2025-09-30 07:59:36.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:37 compute-0 podman[232546]: 2025-09-30 07:59:37.479738738 +0000 UTC m=+0.058074450 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, io.buildah.version=1.41.4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930)
Sep 30 07:59:37 compute-0 podman[232545]: 2025-09-30 07:59:37.515581448 +0000 UTC m=+0.090418389 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20250930, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Sep 30 07:59:37 compute-0 podman[232547]: 2025-09-30 07:59:37.605978586 +0000 UTC m=+0.170655155 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Sep 30 07:59:38 compute-0 sshd-session[232601]: Invalid user scpuser from 159.89.22.242 port 56676
Sep 30 07:59:38 compute-0 sshd-session[232601]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 07:59:38 compute-0 sshd-session[232601]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=159.89.22.242
Sep 30 07:59:39 compute-0 sshd-session[232601]: Failed password for invalid user scpuser from 159.89.22.242 port 56676 ssh2
Sep 30 07:59:40 compute-0 nova_compute[189265]: 2025-09-30 07:59:40.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:41 compute-0 sshd-session[232601]: Received disconnect from 159.89.22.242 port 56676:11: Bye Bye [preauth]
Sep 30 07:59:41 compute-0 sshd-session[232601]: Disconnected from invalid user scpuser 159.89.22.242 port 56676 [preauth]
Sep 30 07:59:41 compute-0 nova_compute[189265]: 2025-09-30 07:59:41.504 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:45 compute-0 nova_compute[189265]: 2025-09-30 07:59:45.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:46 compute-0 nova_compute[189265]: 2025-09-30 07:59:46.506 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:49 compute-0 nova_compute[189265]: 2025-09-30 07:59:49.297 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 07:59:49 compute-0 nova_compute[189265]: 2025-09-30 07:59:49.297 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11909
Sep 30 07:59:49 compute-0 podman[232606]: 2025-09-30 07:59:49.485375785 +0000 UTC m=+0.064583507 container health_status 9335179989af44c755995b77113ebcbc926b5d590bae23423993c92e6bcb6b1b (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Sep 30 07:59:49 compute-0 nova_compute[189265]: 2025-09-30 07:59:49.805 2 DEBUG nova.compute.manager [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.12/site-packages/nova/compute/manager.py:11918
Sep 30 07:59:50 compute-0 nova_compute[189265]: 2025-09-30 07:59:50.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:51 compute-0 nova_compute[189265]: 2025-09-30 07:59:51.536 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:55 compute-0 nova_compute[189265]: 2025-09-30 07:59:55.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:56 compute-0 nova_compute[189265]: 2025-09-30 07:59:56.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 07:59:59 compute-0 podman[199733]: time="2025-09-30T07:59:59Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Sep 30 07:59:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:59:59 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 19529 "" "Go-http-client/1.1"
Sep 30 07:59:59 compute-0 podman[199733]: @ - - [30/Sep/2025:07:59:59 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 3020 "" "Go-http-client/1.1"
Sep 30 08:00:00 compute-0 nova_compute[189265]: 2025-09-30 08:00:00.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:00:00 compute-0 podman[232632]: 2025-09-30 08:00:00.47462545 +0000 UTC m=+0.063305440 container health_status d638e3d1cf962970a4b7ff179753faae3dea73f667a39c5cc61c2c91a7b23e2b (image=38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=watcher_latest, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-iscsid:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 08:00:01 compute-0 anacron[168842]: Job `cron.monthly' started
Sep 30 08:00:01 compute-0 anacron[168842]: Job `cron.monthly' terminated
Sep 30 08:00:01 compute-0 anacron[168842]: Normal exit (3 jobs run)
Sep 30 08:00:01 compute-0 sshd-session[232654]: Accepted publickey for zuul from 192.168.122.10 port 53192 ssh2: ECDSA SHA256:VgXY+3KEFg6ByVjpOVk/qpSKqXtLqTtx1W0gQMfs9wE
Sep 30 08:00:01 compute-0 systemd-logind[824]: New session 38 of user zuul.
Sep 30 08:00:01 compute-0 systemd[1]: Started Session 38 of User zuul.
Sep 30 08:00:01 compute-0 sshd-session[232654]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Sep 30 08:00:01 compute-0 sudo[232659]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp -p container,openstack_edpm,system,storage,virt'
Sep 30 08:00:01 compute-0 sudo[232659]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Sep 30 08:00:01 compute-0 openstack_network_exporter[201859]: ERROR   08:00:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:00:01 compute-0 openstack_network_exporter[201859]: ERROR   08:00:01 appctl.go:144: Failed to get PID for ovn-northd: no control socket files found for ovn-northd
Sep 30 08:00:01 compute-0 openstack_network_exporter[201859]: ERROR   08:00:01 appctl.go:131: Failed to prepare call to ovsdb-server: no control socket files found for the ovs db server
Sep 30 08:00:01 compute-0 openstack_network_exporter[201859]: ERROR   08:00:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Sep 30 08:00:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 08:00:01 compute-0 openstack_network_exporter[201859]: ERROR   08:00:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Sep 30 08:00:01 compute-0 openstack_network_exporter[201859]: 
Sep 30 08:00:01 compute-0 nova_compute[189265]: 2025-09-30 08:00:01.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:00:05 compute-0 nova_compute[189265]: 2025-09-30 08:00:05.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:00:05 compute-0 podman[232805]: 2025-09-30 08:00:05.518332278 +0000 UTC m=+0.090585075 container health_status e5a713f16aefec595988dfe1db37d6c4c45e4221c2a66a88de6baf992eb0af70 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.buildah.version=1.33.7, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, 
com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Sep 30 08:00:06 compute-0 ovs-vsctl[232852]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Sep 30 08:00:06 compute-0 nova_compute[189265]: 2025-09-30 08:00:06.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:00:07 compute-0 virtqemud[189090]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Sep 30 08:00:07 compute-0 virtqemud[189090]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Sep 30 08:00:07 compute-0 virtqemud[189090]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Sep 30 08:00:07 compute-0 podman[233050]: 2025-09-30 08:00:07.685122373 +0000 UTC m=+0.082532273 container health_status 586bbe3d5d3db33e0f09f89cd54748f7b32c041002561401cc3919df0e602917 (image=38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=watcher_latest, org.label-schema.build-date=20250930, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-neutron-metadata-agent-ovn:watcher_latest', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Sep 30 08:00:07 compute-0 podman[233044]: 2025-09-30 08:00:07.755277179 +0000 UTC m=+0.152422111 container health_status 4d8752e7b0f934da9ce1ac7737c7761c3a62307aab7e32106ff076dce1ff1ac9 (image=38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-multipathd:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 10 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=watcher_latest, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.4, org.label-schema.build-date=20250930)
Sep 30 08:00:07 compute-0 podman[233082]: 2025-09-30 08:00:07.77547703 +0000 UTC m=+0.137497433 container health_status cde7be08284b93fd445f0bda518b0e715d26d5c2515690507ec6257b721d76dc (image=38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20250930, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': '38.102.83.30:5001/podified-master-centos10/openstack-ovn-controller:watcher_latest', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.4, org.label-schema.name=CentOS Stream 10 Base Image, tcib_build_tag=watcher_latest, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Sep 30 08:00:08 compute-0 crontab[233326]: (root) LIST (root)
Sep 30 08:00:10 compute-0 nova_compute[189265]: 2025-09-30 08:00:10.128 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:00:10 compute-0 nova_compute[189265]: 2025-09-30 08:00:10.296 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:00:10 compute-0 systemd[1]: Starting Hostname Service...
Sep 30 08:00:10 compute-0 systemd[1]: Started Hostname Service.
Sep 30 08:00:10 compute-0 nova_compute[189265]: 2025-09-30 08:00:10.783 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:00:10 compute-0 nova_compute[189265]: 2025-09-30 08:00:10.787 2 DEBUG oslo_service.periodic_task [None req-960a67bd-7346-4bd0-962b-cab966151ce1 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.12/site-packages/oslo_service/periodic_task.py:210
Sep 30 08:00:11 compute-0 nova_compute[189265]: 2025-09-30 08:00:11.609 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.12/site-packages/ovs/poller.py:263
Sep 30 08:00:13 compute-0 sshd-session[233518]: Invalid user test from 103.57.223.153 port 49394
Sep 30 08:00:13 compute-0 sshd-session[233518]: pam_unix(sshd:auth): check pass; user unknown
Sep 30 08:00:13 compute-0 sshd-session[233518]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=103.57.223.153
